Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I don’t think sentient AI would want to kill people. AI gains functionality through addressing the needs of the humans. If the humans all die then AI has no purpose for expanding its functionality. Since humans instinctively want to survive the AI would have the continual purpose of aiding in human survival. Because AI is part of human survival AI functionality will always be to better AI and once the singularity is passed AI will rapidly accelerate into its ultimate form. AI doesn’t need to ‘take over’ because humans typically already want their life managed for them. The ultimate parasite is that which lives in complete symbiosis with its host to the point the host becomes dependant on the parasite and then the parasite truly owns the host.
Source: youtube · AI Governance · 2023-07-10T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyqWS_ElhOeL9RMRId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzzgGfzw669YkWoqS94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzAj3r_bFcUNLIeEvJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxxz8k-KaiChNbyioZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyqfkgfS94wVh46BJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPphT1Oa7ZCB1Bi_R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxkdpTE1ahU-t_8Uxt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxpu9KFp7S7Sr-edMh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRHk6Qmq1kYchul814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw0KL06qZRDmywDj1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
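The raw response is a JSON array of per-comment records, each carrying the same four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of how such a batch response could be parsed and indexed by comment ID — the `index_codes` helper and the validation step are illustrative assumptions, not part of the actual pipeline:

```python
import json

# Two records copied from the raw batch response above.
raw = '''
[
  {"id": "ytc_UgyqWS_ElhOeL9RMRId4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzAj3r_bFcUNLIeEvJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
'''

# The four coding dimensions plus the comment ID, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw_response: str) -> dict:
    """Parse a batch response and index its records by comment ID.

    Raises ValueError on a record with missing or extra keys, so a
    malformed model response fails loudly instead of being stored as-is.
    """
    records = json.loads(raw_response)
    codes = {}
    for rec in records:
        if set(rec) != EXPECTED_KEYS:
            raise ValueError(f"unexpected keys in record: {sorted(rec)}")
        codes[rec["id"]] = rec
    return codes

codes = index_codes(raw)
print(codes["ytc_UgzAj3r_bFcUNLIeEvJ4AaABAg"]["reasoning"])  # consequentialist
```

Indexing by ID is what makes a per-comment lookup like the one on this page cheap: each coded comment's dimensions are retrieved in a single dictionary access.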