Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- 1:26:03 "we don't have a model for a functioning society where the majority of p… (ytc_UgyzYVs_g…)
- Oh yes, AI just good about giving you its top answer, but just like a dictionary… (ytc_UgxCBxW-r…)
- If AI wants to take credit for ridding the world of humanity, it better get a mo… (ytc_UgwZWJVoD…)
- can you define "rich internal representation"? Do you just mean that the prompt … (ytr_Ugyjwh7Oa…)
- I feel like 100% AI code is okay BUT they must own the code, if it breaks at 3am… (rdc_ohzwn1s)
- I mean, ai is going to start feeding off of itself. It will eventually poison it… (ytc_UgwpJkzoi…)
- Cameras and even those sensors in automated hand dryers have always been geared … (ytc_Ugyuo4ne7…)
- I googled how many W’s were in Zimbabwe and the AI overview said that there were… (ytc_UgzobXi0c…)
Comment
The video discusses the competitive nature of the AI race and mentions that many leaders acknowledge the risks of AGI, yet they continue to push forward. It highlights how financial incentives often outstrip concerns for safety and regulation, which may overshadow discussions about global competition, including China.
What do you think could be the implications of this competition on global safety regulations for AI?
youtube · AI Governance · 2025-12-24T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxPQsdyAluv7VMoCrR4AaABAg.AR69GX83V25AR6pUTt-qOn","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwOcmcVu2iPdHdlvCN4AaABAg.AR5xe_lAh6IAR6qIoS7dFJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugzact06OBboW8dqhJx4AaABAg.AR5o64wZkIbAR6r7UOX0_A","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyE_IycZrfNj7BkS7d4AaABAg.AR55LiodjTMAR6tE7XShKO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyoDhTaK6EUKRCYXr94AaABAg.AR4xL0I2opMAR6ti5FNiJO","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgzaNVAyY0y-DJD6n7V4AaABAg.AR4x4RPSmaQAR6u_70y6po","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgzJdW7OgJi0-Y3qAbJ4AaABAg.AR4uVOg4Ih5AR6v0nnDcRf","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyTRYHlxOAX69X5xCx4AaABAg.AR4t0NRFVlfAR6vjSzNSEF","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugy9vuvVoeJfNPW6dCN4AaABAg.AR4rrOfAdBhAR6wS7YvCBl","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugwhiyj6dEolU9bUaix4AaABAg.AR4pq4vfXyVAR6xHWQ4Yfd","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
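A batch response like the one above can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, not the tool's actual pipeline: the `parse_codings` helper and the `DIMENSIONS` tuple are assumptions based on the four dimensions shown in the Coding Result table, and `raw_response` holds only the first two entries of the array for brevity.

```python
import json

# First two entries of the raw LLM response above; the full array has ten.
raw_response = '''
[
  {"id":"ytr_UgxPQsdyAluv7VMoCrR4AaABAg.AR69GX83V25AR6pUTt-qOn","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwOcmcVu2iPdHdlvCN4AaABAg.AR5xe_lAh6IAR6qIoS7dFJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw batch response into {comment_id: {dimension: value}}.

    Raises ValueError if the payload is not a JSON array of objects,
    or if any object is missing its id or a dimension.
    """
    rows = json.loads(text)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coding objects")
    codings = {}
    for row in rows:
        if "id" not in row or any(d not in row for d in DIMENSIONS):
            raise ValueError("malformed coding row: %r" % (row,))
        codings[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return codings

codings = parse_codings(raw_response)
first = codings["ytr_UgxPQsdyAluv7VMoCrR4AaABAg.AR69GX83V25AR6pUTt-qOn"]
print(first["emotion"])  # fear
```

Validating every row before indexing means a truncated or malformed model response fails loudly at parse time rather than surfacing later as a missing dimension in the coded table.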