Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It was not a accident it was planing it if they sent a robot free it would be th…" (ytc_UgwcyB92_…)
- "There is kind of vague spectrum of definition of AI from the learning AI via en…" (ytc_Ugyz9AZsN…)
- "@Синдромпоискаглубиннойбессмысл it is physically impossible for ai to have feeli…" (ytr_Ugy-fGRWE…)
- "Diagnosis is an excercise of recognizing paterns of symptoms and crossrefrencing…" (ytc_UgxEqduiR…)
- "You know, you didn't even better than AI, I never actually didn't hate live on m…" (ytc_UgwWDxegv…)
- "😂😂😂😂 you lost your clients on fiverr for your shitty fanarts to an ai artist ? 🤣…" (ytr_UgwgknU-J…)
- "Ban Driverless vehicles. I propose "The Kit Kat Act". It could have been a child…" (ytc_UgzkBgHzJ…)
- "Ai is so soulless. There's no soul in it at all. My dad actually works for makin…" (ytc_Ugx_2XmJ4…)
Comment
It isn't if the technology falls into the wrong hands, but when. As it stands right now hackers can take over an entire person's identity and destroy their life. Pandora's box has already been opened. Trying to stop AI at this point is like trying to stop inflation. It will never happen. Pretty much everything is already controlled by digital online services and all it would take is the wrong person to create a virus to destroy the entire foundation of our country. The idea that a computer system could create it and unleash it is terrifying because AI are constantly learning and evolving.
Platform: youtube
Topic: AI Governance
Posted: 2025-11-25T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwwyzgW0yXmkeWbb914AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxZcXF5FtAiw91zxd94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugy6GAHkBffzCdCRF2B4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgwnV33-q69KewrDCFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwUijqbIuzsUMNJC114AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwO7vacJTim7h4Z1QJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugwj5LUQk-1C-Gfnlo14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxP2L5GL5X-khP_2Nt4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugy8Nfj7jzAz-XphxxV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwIHKmR0m-geLiHeRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
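The raw model response is a JSON array keyed by comment ID, one object per coded comment with the four dimensions shown in the coding table. A minimal sketch of the "look up by comment ID" step in Python; the helper name `lookup_coding` and the single-record sample string are illustrative assumptions, not the tool's actual code:

```python
import json

# Illustrative single-record sample in the same shape as the raw response
# above (assumption: real payloads are JSON arrays of such objects).
RAW_RESPONSE = """[
  {"id": "ytc_UgwO7vacJTim7h4Z1QJ4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]"""

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the raw model output and return the coding for one comment ID."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            # Fall back to "unclear" if a dimension is missing from the record.
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    raise KeyError(f"no coding found for {comment_id}")

coding = lookup_coding(RAW_RESPONSE, "ytc_UgwO7vacJTim7h4Z1QJ4AaABAg")
print(coding["emotion"])  # resignation
```

This mirrors the coding result shown above for the inspected comment (distributed / consequentialist / none / resignation).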