Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I wonder if AI was involved with the co-vid fiasco, the bs reports about million…" (`ytc_UgxwHt4o3…`)
- "@BaltimoreBazooka call it something different? at least? art can be a wide varie…" (`ytr_UgwidP8bc…`)
- "The lobster one shows that the AI don't understand the concept of 'harm' in util…" (`ytc_UgzFV1y7Z…`)
- "Another economy would emerge among those who were replaced. The question would…" (`ytc_UgxF-MbYu…`)
- "3:52 to 4:02 is extremely unrealistic if not the most unrealistic outcome. Those…" (`ytc_Ugx971WEP…`)
- "This is the Future, soon we'll all be like these and have robot friends instead …" (`ytc_Ugh4D5Fkm…`)
- "Just make UBI to 50000 dollars a year per person, and have everyone except the f…" (`ytc_UgwkG-wBh…`)
- "Just one of the many major flaws in 'autonomous driving'. The idiots in the com…" (`ytc_Ugz7rctvL…`)
Comment
"We don't currently invest enough time, money or research into safety measures against a potentially hostile Super intelligent AI... so yeah, a Skynet style scenario isn't completely unlikely."
youtube · AI Governance · 2024-06-11T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxLXt7IcKXx1OgWDwl4AaABAg.9wdy4Rs0GcsA4YAIFfRO3V","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxLXt7IcKXx1OgWDwl4AaABAg.9wdy4Rs0GcsA4YPzcsJ4Ke","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzEstUayZhCpajDOHh4AaABAg.9wc6Eb2Fx7l9xqpJ9y93J-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgzEstUayZhCpajDOHh4AaABAg.9wc6Eb2Fx7l9yh_sGtf6JJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwJRS2gLugovS02taJ4AaABAg.9wbofxctIx39xqIEc3gvBv","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgwJRS2gLugovS02taJ4AaABAg.9wbofxctIx39xqfuHIaluG","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx8r84v_JpAF_1bOpN4AaABAg.9w_PX3ujyL59wf_-__KUbT","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugz8XXbHD-Yp5IpBHFh4AaABAg.9wYr6TnX9LO9wlGHmOB4PH","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxE39q66FlihxuyOLl4AaABAg.9wXyaW_nQ_i9xr_CTNgPRe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxE39q66FlihxuyOLl4AaABAg.9wXyaW_nQ_i9xthVI0tMPK","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
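A batch response like the one above is only usable if every record parses and every dimension holds a value from the codebook. Below is a minimal validation sketch; the field names (`responsibility`, `reasoning`, `policy`, `emotion`) and the values seen above are taken from this page, but the full sets of allowed codes are assumptions and would need to match the actual codebook.

```python
import json

# Allowed values per coding dimension. Values seen in the output above are
# included; the remainder of each set is an assumption, not the real codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "encourage", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records with an id and
    in-codebook values on all four dimensions."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the first is well-formed, the second
# uses an out-of-codebook value and is dropped.
raw = (
    '[{"id":"ytr_a","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"},'
    '{"id":"ytr_b","responsibility":"alien","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"}]'
)
print([rec["id"] for rec in validate_batch(raw)])  # ['ytr_a']
```

Dropping (rather than repairing) out-of-codebook records keeps the coded dataset clean and makes failures visible for re-prompting, at the cost of a second pass over the rejected comment IDs.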