Raw LLM Responses
Inspect the exact model output for any coded comment: look a comment up by its ID, or click one of the random samples below.
- “It’d be naive of us, Mr. President, to imagine that these new breakthroughs in … (ytc_UgxdzJXBf…)
- I dont think AI in itself is dangerous, its just that people (specially those in… (ytc_UgzjN2roO…)
- I worked as an online tutor for a couple of years before they sacked all Canadia… (ytc_UgwQhDyFD…)
- The fact they used the internet to train and teach the AI is all I need to hear.… (ytc_Ugwm4vG8-…)
- Is telling LLMs that they are not conscious the same as telling women that they … (ytc_UgyKGv-lz…)
- If 90% of people lose their income due to automation, who will pay for the goods… (ytc_UgyPp0QTu…)
- Elon the hypocrite who talks about how AI can be dangerous but at the same time … (ytc_Ugx94Iddt…)
- Its almost like theres an agenda to keep people reactionary and the ai cant but … (ytc_Ugz87wcmJ…)
Comment
Stopping AI development could indeed push companies to focus on safety and control. This might encourage a more responsible approach to AI technology.
What specific controls do you think should be implemented to ensure AI development is safe?
youtube · AI Governance · 2025-12-24T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
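As an illustration, a coded record carrying these four dimensions can be rendered into the Dimension/Value table format used here. This is a minimal sketch with a hypothetical record and helper name (`to_markdown_table`); the `Coded at` timestamp is added by the pipeline rather than the model, so it is omitted.

```python
# Hypothetical sketch: render one coded record as a "| Dimension | Value |"
# markdown table. Field names match those in the raw LLM response.
record = {
    "id": "ytr_example",  # placeholder ID, not a real comment
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "approval",
}

def to_markdown_table(rec: dict) -> str:
    """Build the two header rows, then one row per coding dimension."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {rec[dim]} |")
    return "\n".join(rows)

print(to_markdown_table(record))
```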
Raw LLM Response
```json
[
{"id":"ytr_UgzypNNcam96B5R8B_d4AaABAg.ARFl5CLkO6gARSiml-3wFO","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw4v_ieZsuw5YF9Z3R4AaABAg.AR6x64bJYbTAR788Aitp51","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyPzcOARjovWMmxcox4AaABAg.AR6mJPGbOPmAR78_uPxVZk","responsibility":"society","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzpmRnU3rp2UH6oLMl4AaABAg.AR6iKnZVjDaAR6kbassbso","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgyXLdAhFcPIoYkDWKd4AaABAg.AR6__uKGvz3AR79TWeRuKl","responsibility":"society","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzZhN-lFXeqvN6zyGF4AaABAg.AR6ZDst9948AR6lSH55djz","responsibility":"society","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzRz1pVegLFhcoPVb94AaABAg.AR6V8CxttjGAR6mNOPMxp8","responsibility":"society","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxShAN_wkwPIS5YUfp4AaABAg.AR6U3NZRe-LAR6n2iM5mmI","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugxo2VuSFv7mAO8QiCh4AaABAg.AR6SuTATNFPAR6oRj2vG6H","responsibility":"society","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw6XJn67qNPosCBArJ4AaABAg.AR6Kh8v9kHdAR6or9-mDTI","responsibility":"society","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
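A batch response like this is easiest to trust if it is validated before the codings are stored. Below is a minimal sketch that parses the raw JSON and drops any record with a missing ID or an out-of-vocabulary value. The value sets are inferred only from this sample batch (the real codebook may allow more values), and `SCHEMA` / `validate_batch` are hypothetical names, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from this batch only.
# ASSUMPTION: the real codebook may define additional values.
SCHEMA = {
    "responsibility": {"company", "developer", "society"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "none", "unclear", "industry_self"},
    "emotion": {"approval", "fear", "indifference", "mixed", "resignation"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response; keep only records that carry an
    id and a recognised value for every coding dimension."""
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# One well-formed record, mirroring the shape shown above (placeholder id).
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"approval"}]')
print(validate_batch(raw))  # the single record passes validation
```

Rejected records can then be re-queued for recoding rather than silently entering the results.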