Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
I can totally see how A.I. is the last technology we ever create. Hereafter it's only logical to think that A.I. will create every other technology that will follow, if any.
And once a computer program can "improve" and modify itself, NO-ONE can predict the outcome. So yea, it's probably wise to prepare for the worst. Don't think your millions in your bank account will help you when D-Day comes. Rather use it wisely NOW, before the millions become worthless.
Source: youtube · AI Governance · 2025-06-23T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw5OnOq1apyhxfSInd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6GMwGphlc4zcAETB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyplHmgyL3envBc9i54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3QhWnjyuD8NIAG1t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwWfjVYCJpa52r7Cb94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyR74bbngVvomG_UQh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwo5CEQJ8pbMaoUsq14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwi7uMkQj4_bJB2BeZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxtX2prwNhbxONBT9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzbgJShS7Z7OsqVSm14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
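A raw response like the one above can be parsed into a comment-ID → coding map and checked against the coding scheme before it is stored. The sketch below is a minimal, hypothetical example: the dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the coding table, but the sets of allowed values are only those observed in this sample — the project's full codebook may include more.

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# Assumption: the real codebook may define additional values.
DIMENSIONS = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records)
    into a mapping of comment ID -> {dimension: value}."""
    records = json.loads(raw)
    codings = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        codings[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return codings
```

With this in place, looking up a coded comment by ID (as the page header describes) is a plain dictionary access, e.g. `parse_codings(raw)["ytc_…"]["emotion"]`.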