Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgxZYT29P…`: "this is not a people problem. this is a post-COVID issue with the new generation…"
- `ytr_UgzKM7fvA…`: "I was excited about ChatGPT too. However, ChatGPT reminds me of a "magic 8 ball…"
- `ytr_UgyW6Nsxh…`: "AI generations looks so creepy and unnatural. If the studios prevail, I am serio…"
- `ytc_UgwK8RGRi…`: "The truth is in between. We will need to use AI and get better with it. We will …"
- `ytc_UgyVGXa7x…`: "the ai vs real art argument is so dumb idc. Like trust me, if you're a good arti…"
- `ytc_UgxeBdRlM…`: "These bastards are trying to take good driving jobs away. I hope Waymo continue…"
- `ytc_Ugw0zpnpk…`: "Ben appears to have zero concept of the ramifications this development presents.…"
- `ytc_Ugw_jnUo1…`: "The other side of the looking glass is the other side of your phone, tablet, t.v…"
Comment
The video discusses the concept of Artificial General Intelligence (AGI), which refers to AI systems that have a level of intelligence comparable to or exceeding that of humans. Professor Stuart Russell highlights concerns about the rapid advancement of AI and the potential risks it poses to humanity, suggesting that AGI could surpass human intelligence soon.
How do you think we can ensure that AI systems are developed with our best interests in mind?
youtube · AI Governance · 2025-12-24T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxPQsdyAluv7VMoCrR4AaABAg.AR69GX83V25AR6pUTt-qOn","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwOcmcVu2iPdHdlvCN4AaABAg.AR5xe_lAh6IAR6qIoS7dFJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugzact06OBboW8dqhJx4AaABAg.AR5o64wZkIbAR6r7UOX0_A","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyE_IycZrfNj7BkS7d4AaABAg.AR55LiodjTMAR6tE7XShKO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyoDhTaK6EUKRCYXr94AaABAg.AR4xL0I2opMAR6ti5FNiJO","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzaNVAyY0y-DJD6n7V4AaABAg.AR4x4RPSmaQAR6u_70y6po","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzJdW7OgJi0-Y3qAbJ4AaABAg.AR4uVOg4Ih5AR6v0nnDcRf","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyTRYHlxOAX69X5xCx4AaABAg.AR4t0NRFVlfAR6vjSzNSEF","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugy9vuvVoeJfNPW6dCN4AaABAg.AR4rrOfAdBhAR6wS7YvCBl","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwhiyj6dEolU9bUaix4AaABAg.AR4pq4vfXyVAR6xHWQ4Yfd","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
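The coding result shown above is one row of a batch response like this JSON. Before writing codes to storage, a response like this can be parsed and checked against the codebook. The sketch below is a minimal, hypothetical validator: the allowed values are inferred only from the samples on this page (the real codebook may define more categories), and the function names are illustrative, not part of this tool.

```python
import json
from collections import Counter

# Allowed code values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "government",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)
    for row in rows:
        missing = {"id", *ALLOWED} - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing keys {missing}")
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row[dim]!r}")
    return rows

def tally(rows: list[dict], dim: str) -> Counter:
    """Distribution of one dimension across a validated batch."""
    return Counter(row[dim] for row in rows)
```

Validating before storage means a single hallucinated label (or truncated JSON) fails loudly instead of silently skewing the coded corpus; `tally` then gives quick per-dimension distributions for spot checks like the random-sample view above.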