Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “So here’s the questions… even if you must have a driver you know the company won…” (ytc_UgzEd5S6-…)
- “To be fair, the first examples are still a picture of how an advance AI is learn…” (ytc_UgwvuWBKl…)
- “As much as I disagree with AI art I promise you the seller doesn't care, he's ri…” (ytc_Ugz0Rxa2G…)
- “AI might forget that a bullet to the head does the same to them as it does for u…” (ytc_UgwGC80sJ…)
- “Recent reports revealed a 2021 injury report that claims a robot designed to mov…” (ytc_UgyYO2IjF…)
- “What's funny to me is that humans still don't want to admit that we are artifici…” (ytc_UgyaZPwyD…)
- “I'm in no way support AI "art" but for fun it's fine I think, example: kermit th…” (ytc_UgzIpmh8Q…)
- “It's common sense / Give a robot data with racial bias / And it will become a robot …” (ytc_UgwXYr3dO…)
Comment
It is 10 years in the future. We are sitting in our "homo sapien rest apparatuses" watching the new daily video courtesy of our digital overlords. The video shows this interview, specifically where Elon says "If we wait until a disaster happens before regulating AI, it'll be too late." And then the artificial face of our digital god comes on screen and just laughs and says "You probably should have listened huh?"
youtube · AI Governance · 2023-04-19T07:4… · ♥ 37
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugyv5XTkm8cpAsLyOQV4AaABAg.9of9Z30NR-y9ofCgaROAR8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxwEwKvHZWqs9xhTcx4AaABAg.9of8Wd7S7so9ofBqiE5iW_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxuWxoVAOVFsqeL4IF4AaABAg.9of1cngrccM9ofFL0NRtSi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxiVymUX2d9jIazTat4AaABAg.9of-OHSTp0G9of-ahfysxs","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgzupSCo7I64eJJyYC94AaABAg.9oey7BcdFqg9ohN2MF13w5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugzrv00ftucrQfUlew14AaABAg.9oesogIyvS39oguIMXDK0B","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_Ugzrv00ftucrQfUlew14AaABAg.9oesogIyvS39ogzFqAEHmL","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgwXl6sKkIDjWYEuYlB4AaABAg.9oer-aiMFT79ofRPDT50Ww","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwXl6sKkIDjWYEuYlB4AaABAg.9oer-aiMFT79ofTb4ewXsr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugz4M_cWd2tZ0mgqIN14AaABAg.9oepPa1bY8g9ofwW2PcyeS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```