Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- If you record a video but you don't post it on youtube till 5 years later,can yo… (ytc_UgzxNGnv3…)
- (Sorry for going on; I wonder if anyone even reads all of it =) ) One thing I th… (ytr_UgzolWtZT…)
- we'll figure it out when we get there.. "when AI thought human is a threat to th… (ytc_UgyS3-Zrg…)
- You are using what I call the "cotton gin analogy" where a new technology create… (ytc_UgzRTSbzs…)
- SOS to Masters of the universe of this galaxy please eliminate this Artificial i… (ytc_UgwR_CEYk…)
- Oh so they're being financed I would look into the HHS for those funds being giv… (ytc_Ugxp1T4em…)
- I saw someone who said “Why should I bother reading something written by AI if n… (ytc_UgwjrbMfw…)
- Tesla's self-driving autopilot FSD has already saved hundreds of lives and will … (ytc_UgyoZhy34…)
Comment

> The video discusses the potential risks of AGI and how advanced AI could pose extinction threats if not controlled properly. While some may believe humanity will be fine, experts like Stuart Russell warn that it's essential to prioritize safety in AI development. Given these concerns, what steps do you think should be taken to ensure that AGI benefits humanity rather than harms it?

Platform: youtube · Topic: AI Governance · Posted: 2025-12-24T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugy4T5UmRxtAqFNsuLh4AaABAg.AR4eVOubbApAR6yA87uKL1","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugwg8JX0OF3QY5D3TQl4AaABAg.AR4dVGMqOtNAR6yeo8pz6N","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwZAMsl34DUaMr9JnB4AaABAg.AR4T-cFn_l7AR6zKhRGsZ0","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxMiRa84AqKg1o9LH14AaABAg.AR4EzBUqcWxAR7-uY8wYqA","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgzsVMLmwaMXcOJRqbJ4AaABAg.AR4CofgPducAR70WHKW6UI","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzVRWFxrX53EdkF5KZ4AaABAg.AR4ADqaGHSHAR717tO8Nip","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgzoaI5hHkB34Cafyc14AaABAg.AR3xzM8Hmd0AR72Dg-09XW","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_Ugzb2AFfMfIczZsuA0l4AaABAg.AR3rxHiyXu5AR72j8nO2rp","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgyepHd5dAiNNR70POp4AaABAg.AR3nJcBxn8KAR73_-aCD4x","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgzfpPRmuOnv31pHTNx4AaABAg.AR3Xf6P9gZrAR7B9rnMRM8","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
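The raw LLM response is a JSON array with one object per comment, carrying the same four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and a comment's codes looked up by ID — the IDs and values below are hypothetical placeholders, not real records:

```python
import json

# Hypothetical sample response in the same shape as the raw output above.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and index each coded record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_by_id(raw_response)
print(codes["ytr_example1"]["emotion"])  # fear
```

Indexing by ID mirrors the "Look up by comment ID" view: once the array is keyed by `id`, any coded comment can be fetched in constant time.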