Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
1:24:19 "There is still a chance that we can figure out how to develop an AI that won't want to take over from us". I think that this is slightly wrong: if an AI is way more capable, it's very natural that it will be given more responsibilities and eventually take over. The way to do it is have the AI love us and care for us in that process.
Source: youtube · Topic: AI Governance · Posted: 2025-06-16T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwmrm_v7xJWVqG3ogd4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwhHJfZYvUNi9ttUGp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwtjN8i_dTUh_D-vrl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyCeOWyN8dixl8xRl54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZO9NdPBMtQGvT3jp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxK-HCZi8QJsrw_Ce14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxLuCmgRnV1WArzUTp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxYilfM7LRQp2m0sSN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxYDiRrPCZVK16zIft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyaMlPCTpC2DANnBjt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
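A raw response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, assuming the allowed values for each dimension are limited to those visible in the samples above (the actual codebook may define more categories); rows with an unrecognized value are dropped rather than stored.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# example rows on this page, not from the project's real codebook.
ALLOWED = {
    "responsibility": {"developer", "government", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows where every
    dimension holds a recognized value."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]
```

A row that the model mis-codes (say, an invented `responsibility` label) is filtered out, so only schema-conforming codings reach the results table.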