Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I handle dental insurance claims and the inevitable appeals. I’ve seen so many…" (rdc_jtgo3ow)
- "Look I get this. I get that artist feels really bad cause AI s getting way out …" (ytc_UgzYW8Zc0…)
- "The problem is energetic. IA and robotics spends massive energy!!! Without energ…" (ytc_Ugw9RKxBs…)
- "I dont want AI car AI PC AI phone. I dont want AI apps AI tv or any AI in tech. …" (ytc_Ugz3FZGg_…)
- "Yes, the people going on about how the current models are only LLMs and not "rea…" (ytr_UgwrzOutM…)
- "If 300 million jobs disappear, we just need to pay people through the…" (translated from French) (ytc_UgyFrgMaw…)
- "I spent many years in the trucking industry and I'm glad I'm retired now. With t…" (ytc_Ugwa_QhLN…)
- "It's sad anyone would think a driverless vehicle is ok. Buses, Semi trucks and t…" (ytc_Ugxg0oRMp…)
Comment
The average person does know how to interact with Superior intelligence, they are very weary of it and avoid it. Someone who thinks they're smart they might not be accustomed to maintaining a survival instinct. If they believe it serves them they embrace it, and worship it. If you're smart enough to know the AI doesn't care or have the best interest of you at heart you know which association is correct. If you believe your opponent will produce the evil anyway, you will do it yourself. Here we are.
youtube · AI Governance · 2026-03-27T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugxv1SA2Lm0FMbaXlDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugwp_GByrspTQwDemSR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_UgxFnPMClvaWWMFxZL14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugzt16WRmir7pqX14sl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyscxdBu63wSbodL3J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxymENgD5WG3wjWZuZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwBOPp3CvpdEbK0yT14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugzw6JfU_hNVPNeYXVt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugx_TNzSWKFhdvkEiul4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugwxe8owkZ3cDfrIu0d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
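A raw response like the one above is a JSON array of records, each mapping a comment ID to one value per coding dimension. A minimal sketch of how such a payload could be parsed and validated, assuming the allowed vocabularies are the ones visible in this sample (the real code books may contain more values, and `parse_coding_response` is a hypothetical helper, not part of the tool shown here):

```python
import json

# Allowed values per dimension, inferred only from the sample above.
ALLOWED = {
    "responsibility": {"none", "unclear", "company", "developer",
                       "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "unclear", "liability", "regulate", "ban"},
    "emotion": {"approval", "unclear", "fear", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping any record with an out-of-vocabulary value."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        codes = {k: v for k, v in record.items() if k != "id"}
        if cid and all(v in ALLOWED.get(k, set()) for k, v in codes.items()):
            coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"regulate","emotion":"outrage"}]')
print(parse_coding_response(raw))
```

Records that fail validation are silently dropped here; a production coder would more likely log them or re-prompt the model for the offending IDs.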