Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "That's why you implement policy to deal with automation. Humanity could phase ou…" (ytc_UgwhO-KfU…)
- "AI does not need consciousn to kill us all. AI does not to be self aware to kil…" (ytc_UgxHTTZZH…)
- "it's interesting if we become the forerunners and creators to what could be the …" (ytc_UgzGJ3_9c…)
- "We are all participants, because as you interact with AI it is learning. Be ki…" (ytc_UgweIOcZT…)
- "I actually want AI to overrule humans and lead this planet forward, with or with…" (ytc_UgyhU0Esw…)
- "That robot knew what he was doing, but he just made it look like an accident.…" (ytc_UgyX-9ETQ…)
- "You are so dump! You didn't said anything about the driver of the Tesla which cr…" (ytc_UgyPndd7u…)
- "Anybody else notice that when this robot is angry or makes a mad face it looks a…" (ytc_UgygKVYWk…)
Comment

> 🤔 >divine intelligence (God) > artificial intelligence> natural intelligence( nature, man) > divine intelligence> artificial intelligence > ...>
> There is no end, there is only the whole thing 🤔

youtube · AI Governance · 2024-11-23T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugyq9Qzyj1sUeAIYsBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxAFSVinb97o7r4TmF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgymunjQEqXTWAaELr94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNzcVgQNL_-tn4fKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwiBETzcN_ZGnsmwOh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwREs1EBBFLKt6VdKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwmEQLWg9c-qwLYef14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwqJLhRFcJ0qX0Ex5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgybY24isA9BeaSbHrN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzykIdR-VPX7ZmkVX54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
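A raw response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal example, assuming the four dimensions shown in the coding table; the allowed code sets are inferred only from the values visible in this section, so the real codebook may contain more categories.

```python
import json

# Allowed codes per dimension. NOTE: these sets are inferred from the values
# observed in this section only; the full codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "user", "distributed", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry a YouTube comment ID...
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # ...and a legal code for each of the four dimensions.
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugyq9Qzyj1sUeAIYsBN4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(len(parse_llm_response(raw)))  # → 1
```

Dropping malformed records rather than raising keeps a batch of ten codings usable even when the model mangles one entry; rejected IDs could instead be queued for re-coding.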