Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgyDnXpud…`: "This isn't an accurate depiction of what happened. There were researchers that t…"
- `rdc_fal82l6`: "I am saying that perhaps the A.I.s decisions on matters where race is accounted …"
- `ytr_Ugwldu4mL…`: "Pretty sure we have a lot of time before an ai gets smart enough to do that…"
- `ytc_Ugx_EctUG…`: "AI doesn't have a 'conscious', no. And likely never will become conscious becaus…"
- `ytc_UgyGOOI7Y…`: "Perhaps it's too much to ask for companies to hire multiple philosophers with va…"
- `ytc_UgwvHBfqL…`: "I always try to express my concerns with AI art. I hate it honestly… it steals f…"
- `ytc_UgzIChcTZ…`: "- People are trying to use AI to get around paying artists - People are using AI…"
- `rdc_kowhezy`: "Here is a newer version. Including how I got it. Not sure it's complete, it seem…"
Comment
Mr Harari often says in his conferences that AI "learns and changes by itself", but this is simply not true (for LLMs, at least). Not to say that it couldn't be, if we took a different technical direction. Microsoft already tried this in 2016 with "Tay", a chatbot that could learn from conversations with users, and it went unhinged in less than a day. Lesson learned: this is not how modern LLMs are made.
On the other hand, the instrumental-convergence argument seems solid.
youtube · AI Governance · 2025-09-11T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx5sSg3OJLqhWt16L54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw-2brree5pM1cEgal4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxXiw3mQZII-DBcjoV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3UHNGTydNaDz7mwB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyISNFAmDQnB7fczkV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyNbfCtjVTFsPGdQDB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyxh2zZzeqqmNiOdCN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyhiGcVqbAUgRml7tR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxMO88CNTyAfuPN4o14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxNNqEdFG60etPeaE14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
```
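The raw response is a JSON array of per-comment codings, so recovering the coding for a given comment ID takes only the standard library. A minimal sketch (the `raw_response` excerpt and variable names are illustrative, not part of the tool):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = """[
  {"id": "ytc_Ugyxh2zZzeqqmNiOdCN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgxXiw3mQZII-DBcjoV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugyxh2zZzeqqmNiOdCN4AaABAg"]
print(coding["policy"])   # industry_self
print(coding["emotion"])  # indifference
```

The same index pattern scales to a full batch of responses: parse each array, merge the dicts, and any coded comment can be inspected by its ID.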