Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "World Economic Forum wants to remove 90% of the population of the Planet. Doesn'…" (`ytc_Ugx8MF14S…`)
- "ai shouldn't be able to give such dangerously inaccurate information, people sho…" (`ytc_UgwKlyLbw…`)
- "Depending on where you are in the world, there's all sorts of social pressures a…" (`rdc_lzbyd4d`)
- "Yes, we agree with you. They should give you a proper salary and also some compe…" (`ytr_Ugw5rv-un…`)
- "The idea is AI will do the majority of the work for us and leave us to pursue wh…" (`ytr_Ugw1l0oQL…`)
- "I wasnt super worried about exisitential risks to humanity of the newest AGI con…" (`ytc_UgywzpC9V…`)
- "Unfortunately, it's the future. The driverless vehicles are getting smarter and …" (`ytc_UgxT2N_pj…`)
- "I am all for technology but this and the driverless taxis are something I am not…" (`ytc_UgyF3qCdj…`)
Comment
The theory itself it's perfectly fit for the ai , however the timeline can shift very quickly , for the ai timeline itself not limited with theory of growth with human we grow from collective mind of countless individual, once ai have their own ability to improve itself without a rule and self aware yeah very scary , and lately imminent property keep happening with ai (ai learn somethin they not taught to do) and we don't know why it's happened
youtube
AI Responsibility
2025-10-31T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwECSeLScWQ5g8kcJN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxys4MKFfhzGW_-RzZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7l8L1cyjWiqimYOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzdaUoVadY7oVQAEjR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzqaO80LLsc7KOopRd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw82XumSElo5PzfCmd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwit4FsSVy_CDRwLkp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzfzFO71yk8u_-j6YB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwgf5pzW-Sy9pHXrhd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzg3Fw-bAbyI1h4iTJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
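Since the model returns one JSON array per batch, a downstream step has to parse it and reject records whose dimension values fall outside the codebook before they reach the coding table. Below is a minimal sketch of such a check; the allowed value sets are inferred only from the codes visible in this sample (the full codebook may define more), and the helper name `validate_codes` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the codes observed in this
# sample; the actual codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with a missing
    or out-of-codebook value on any coding dimension."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
print(len(validate_codes(raw)))  # → 1
```

A record that passes this gate can then be stored alongside its `Coded at` timestamp, as in the table above; anything that fails can be re-queued for the model to re-code.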