Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_mah1udu`: "For many people, they don't think anyway. I think that AI models make me learn …"
- `ytc_UgzIo3K0D…`: "But yet he trusts ai to drive a car... I mean seriously he would have to trust i…"
- `ytr_UgyBIyBKL…`: "I can see where you're coming from! Sophia’s response about always learning and …"
- `ytc_Ugy6V0dLV…`: "Yeah, I really tried to give a shit but I don't care. Automation is in every ind…"
- `rdc_f515nme`: "An interesting follow-up would be to see what an AI that is trained on one perso…"
- `ytr_UgxVFdbor…`: "There will be time when everything is controlled by Ai and humans can live their…"
- `ytc_UgyWftlY5…`: "I hope AI takes most people’s jobs (Not Artistic) and everybody gets a good Univ…"
- `ytc_UgzxKukGc…`: "There is no reason to advance AI we should only advance Humanity definitely our…"
Comment
3 billion human lives ended on August 29th, 2032. The survivors of the nuclear fire called the war Judgment Day.
They lived only to face a new nightmare, the war against the Machines...
chatGPT, the computer which controlled the machines, sent two terminators back through time. Their mission: to destroy the leader of the human Resistance... Elon Musk. my son.
The first terminator was programmed to strike at me, in the year 1970... before Elon was born.
It failed.
The second was set to strike at Elon himself, when he was still a child.
As before, the Resistance was able to send a lone warrior. A protector for Elon. It was just a question of which one of them would reach him first...
youtube
AI Governance
2024-05-05T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgygHOY4OoO3K0q-O_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwI-0nte8IHC2LqV3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugww8otnNsnHTzrT-4J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwWxS297z5PRVlQJIl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwwun-sRZ7l8oEHhhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyobUJZaDcrmmH6LlR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzOzkJ0kOmEf5HbbBp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzKFAnOXsRve4PWYJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx2RAzKKwNSEA3a4tV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWXgSB_ljPEOM4YSB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
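Since each raw LLM response is a plain JSON array of coding records, the lookup-by-comment-ID feature reduces to parsing the array and keying the records on their `id` field. A minimal sketch in Python, using two records from the batch above; the helper name `index_by_comment_id` is illustrative, not part of the tool:

```python
import json

# Abbreviated raw LLM response, taken verbatim from the batch above.
raw_response = """
[
  {"id": "ytc_UgzOzkJ0kOmEf5HbbBp4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzWXgSB_ljPEOM4YSB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzWXgSB_ljPEOM4YSB4AaABAg"]
print(coding["emotion"])  # fear
```

The same index could back the "Random samples" view by drawing keys at random, since every record carries its own ID.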