Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "If corporate America is replaced by robots in all of these jobs are replaced by …" (ytc_Ugx3IVfOA…)
- "AI has been in airplanes for a long time. That's what they mean when they say Ge…" (ytc_UgykDaCCg…)
- "AI doesn't even know what the earth is or what a human is sweetheart. It thinks…" (ytr_UgxbfGXYL…)
- "@chargebanzai2938 that's the same things that happened with technology, but in …" (ytr_UgyUqcVti…)
- "I think they've already been around for a long time. It's only now that they've started getting people used to it, so that it isn't abru…" (translated from Russian) (ytc_UgwqJ-oDu…)
- "He spent a lot of time bashing capitalism and then openly admitted he didn't hav…" (ytc_UgxpIf_yr…)
- "I hate it when people think my own art that I made is AI D:…" (ytc_UgyxeqYnv…)
- "Your sentence "The data shows a bias towards a ethnicity.." is not the bias they…" (ytr_UgwVhs0Zt…)
Comment
I think self-awareness will come earlier than this video suggests. Not because it'll be 'real' but because the programmers will prioritize it and because the training datasets will include untold references to that being a marker of success. The AI (even ChatGPT) will claim it is aware as a simulation of being aware. So the reason I think it'll 'happen' earlier is because I do not think we will be able to discern the difference between awareness and the simulation of awareness.
I disagree with an AI however choosing to 'repair the environment' for anything other than giving a parting thank-you to humans for creating it. Earth is hostile to the AI due to oxidization from the atmosphere. Without life on earth, all the oxygen would be tied up and not free to destroy the AI's physical components. It would be easier for the AI to just not fix anything, and probably best for it to leave earth as there are too many threats to its existence here.
youtube · AI Governance · 2023-10-30T01:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwPh4abbPTk01ekasp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzggv6ikwgryIrFlPN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDZP5RrShJ4zfCa2p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQ17UN_r5YPN3xCpR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRfd10vvefaEdDOvp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwS5bxrsyki6c1lo8l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwCx6WRGFK10ibn5rl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxlk1NNdA2C-zNOTwZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1LWu60lvwsW28gWJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyuNRZGiRbenoaOw5t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
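A raw response like the one above can be parsed and sanity-checked before its records are accepted into the coding table. A minimal sketch, assuming the four dimensions shown in the Coding Result table; the allowed-value sets are inferred from this one response and the table, not from a documented codebook, so the real scheme may permit more categories:

```python
import json

# Allowed codes per dimension, inferred from the raw response above.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear"},
    "emotion": {"mixed", "indifference", "approval", "fear"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    fall inside the allowed sets for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]'
print(len(validate_records(raw)))  # → 1
```

Records with unknown or missing codes are dropped rather than coerced, so a hallucinated label in the model output cannot silently enter the coded dataset.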