Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- Skynet became self aware 8/29/1997 at 2:14am (ask Siri) We’re still here. I agre… (ytc_UgxBrVoI6…)
- This is the exact same thing as the Tower of Babel and what they were doing to b… (ytc_UgwaCHWJ4…)
- Since AI is a work that required access to the whole of accumulated human knowle… (ytc_UgzFt7iLm…)
- I wish you would just state the obvious, that "AI" in its current form is danger… (ytc_UgwjF2NyU…)
- I hope this self-driving garbage doesn't come to Europe. Driving isn't just abou… (ytc_UgwhOoI_t…)
- I built up and fine tuned a company’s illustration branding from the ground up f… (ytc_Ugxy3Wewn…)
- Saying you made art the ai made it literally AI taking different pieces of art f… (ytc_UgxZaMA0H…)
- No its time for Employee Bill Of RIGHTS! Companies not in financial crisis do … (ytc_Ugzed7zhe…)
Comment
It seems like you're expressing concern about humanity creating powerful technologies like nuclear weapons and now artificial intelligence without fully considering the risks. The video discusses how this pursuit can lead to extinction-level threats if not managed properly.
What do you think can be done to ensure that future advancements in AI prioritize safety and human interests?
youtube · AI Governance · 2025-12-24T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxPQsdyAluv7VMoCrR4AaABAg.AR69GX83V25AR6pUTt-qOn","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwOcmcVu2iPdHdlvCN4AaABAg.AR5xe_lAh6IAR6qIoS7dFJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugzact06OBboW8dqhJx4AaABAg.AR5o64wZkIbAR6r7UOX0_A","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyE_IycZrfNj7BkS7d4AaABAg.AR55LiodjTMAR6tE7XShKO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyoDhTaK6EUKRCYXr94AaABAg.AR4xL0I2opMAR6ti5FNiJO","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzaNVAyY0y-DJD6n7V4AaABAg.AR4x4RPSmaQAR6u_70y6po","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzJdW7OgJi0-Y3qAbJ4AaABAg.AR4uVOg4Ih5AR6v0nnDcRf","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyTRYHlxOAX69X5xCx4AaABAg.AR4t0NRFVlfAR6vjSzNSEF","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugy9vuvVoeJfNPW6dCN4AaABAg.AR4rrOfAdBhAR6wS7YvCBl","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwhiyj6dEolU9bUaix4AaABAg.AR4pq4vfXyVAR6xHWQ4Yfd","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
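The raw response above is a JSON array with one record per coded comment, each carrying an `id` plus four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and validated is shown below; the allowed category values are inferred from the sample output, and the actual codebook may define more, so treat `CODEBOOK` as an assumption rather than the tool's real schema.

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may contain additional categories.
CODEBOOK = {
    "responsibility": {"company", "developer", "government", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_batch(raw: str) -> list:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record should carry a comment ID with the "ytr_" prefix.
        if not str(rec.get("id", "")).startswith("ytr_"):
            continue
        # Drop records whose codes fall outside the (assumed) codebook.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Usage with a single hypothetical record (the ID here is illustrative):
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = parse_batch(raw)
emotions = Counter(r["emotion"] for r in coded)
```

Validating against a fixed codebook like this catches the common failure mode of batch coding, where the model invents an off-schema label for one record and silently corrupts downstream tallies.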