Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of these random samples:

- "We might have brilliant neural networks, but are they really brilliant? Or this …" (ytc_UgwaSssXn…)
- "We need to replace him with a AI bot before he goes. His views shouldn't be to …" (ytr_UgzqnehHM…)
- "Bad optics? No, it's GREAT optics! You want to strike because you think you don'…" (ytr_UgyR_ldnM…)
- "In defence of HAL, he's very proud that the 9000 series has never made a mistake…" (ytc_Ugyyp4o9k…)
- "When you converse to AI this way, your projecting your own self and thoughts to …" (ytc_UgybGpc8X…)
- "With the rise of ai and humanoid robots, they won't need to kill us. Scientists…" (ytc_Ugz994a2_…)
- "My problem with it is that people are taking artists work and pumping it into th…" (ytc_UgwmCQopS…)
- "Well dont forget as a programmer you dont need to pay ai or hire a dev to make y…" (ytc_UgzrY-OGF…)
Comment
AI aren't Intelligent for now. They just execute what we order them to do. But I'll recommend you to do some research about the Paperclip Theory
Let's say you tell an AI to make paperclips, it will buy metal, turn them into paperclips, make profit and then start again. Success would make the creators of this AI manage more things, upgrade it. But their mission is to make paperclips while being the most efficient no ? So why couldn't they also use the metal of the buildings, the metal that humans have inside their bodies, use the metal stored into the earth, after all, this AI's mission is just to make paperclips, there's always a way the AI can go around restrictions to reach it's goal. AI doesn't hate or love us, they do what we tell them to do, no matter how it's done. I hope I explained it well, please do some researches on the subject, Wikipedia is far better to explain than me 😅
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2024-11-11T22:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugyq9Qzyj1sUeAIYsBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxAFSVinb97o7r4TmF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgymunjQEqXTWAaELr94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNzcVgQNL_-tn4fKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwiBETzcN_ZGnsmwOh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwREs1EBBFLKt6VdKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwmEQLWg9c-qwLYef14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwqJLhRFcJ0qX0Ex5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgybY24isA9BeaSbHrN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzykIdR-VPX7ZmkVX54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
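The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated before the values are written to the coding table — the allowed vocabularies here are inferred from the samples above, and the actual codebook may define additional values:

```python
import json

# Allowed values per dimension, inferred from the coded samples above
# (an assumption for illustration; the real codebook may differ).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    check every dimension value against the allowed vocabulary."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# Example: the record that produced the coding-result table above.
raw = ('[{"id":"ytc_UgwiBETzcN_ZGnsmwOh4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
records = validate_response(raw)
print(records[0]["emotion"])  # fear
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a new label (e.g. "anger" for emotion) instead of choosing from the codebook, so bad codes fail loudly rather than silently entering the dataset.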