Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgwRzW9LL…` — "This isn't the first autonomous car fatality. An autonomous Tesla hit the broad …"
- `ytc_UgxJYrVRj…` — "10:28 That's not really true. It really is difficult to get something specific, …"
- `ytc_UgyACIS1n…` — "Actually AI can't figure out how to provide working software for the local tax a…"
- `ytc_UgwVMDlxm…` — "Ai isn't some sort of malevolent hateful spirit, it's man made and man should be…"
- `ytc_UgxArRWSM…` — "At 5:01 the man Robot 🤖 said what are you talking about I think gargle ? Or whoe…"
- `rdc_fwhp3ql` — "True. And most of the Britons on the continent couldn't vote in the brexit poll,…"
- `ytc_UgyJNPTaE…` — "I draw, I think you have to be somwhat creative to make an ai art. But it's more…"
- `ytr_Ugym-W0GK…` — "Do not, I swear even if it's an ai it memorizes politeness then it will always p…"
Comment (youtube, 2023-05-09T02:1…, ♥ 79)

> The problem with AI is that sooner or later a true AI (artificial lifeform) will arrive and the AI may at some point be the bad actor. That's what intellectuals are concerned about. Imagine something exponentially smarter and faster than the smartest human with instant access to all the data in the world.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz2c4iQ354_BAyhajl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz4E_J81BLqUMhFm_x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyXdHsAWxCoQWq6S1h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxOezE-eZEluoRFjzZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy8zGsrm8L456mhn3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwhm8pufpjficdPZqx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz0uVCA6naBuHsuOup4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw4yEKr0UwOz2Z4CuF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwVPnaO39g2H1P-q4h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwgb-54GW8EP3k9CUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
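For readers processing exports like this one, the comment-ID lookup shown above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: the `index_by_id` helper is hypothetical, though the four dimension names (responsibility, reasoning, policy, emotion) come directly from the Coding Result table.

```python
import json

# The four coding dimensions, as listed in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse the model's JSON array and map each comment ID to its codes."""
    rows = json.loads(raw_response)
    indexed = {}
    for row in rows:
        # Keep only the four coding dimensions; default missing ones to "unclear".
        indexed[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return indexed

# One row from the raw response above, used as sample input.
raw = ('[{"id":"ytc_Ugwgb-54GW8EP3k9CUt4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
codes = index_by_id(raw)
print(codes["ytc_Ugwgb-54GW8EP3k9CUt4AaABAg"]["emotion"])  # → fear
```

Indexing by ID once, rather than scanning the array per lookup, keeps random-sample inspection O(1) per comment.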