Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Comment
AI will hit a wall. As it gets more complicated, people will not understand how to use it. Theyll misuse it and it wont perform well or will cause problems. AI, unlike other technologies, is largely limited by people. We're probably never gonna have large, problem-free AI systems (ie controlling all the traffic in a country)
youtube · AI Governance · 2022-07-04T21:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugx4qxWJT1Lkb4w_ELF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzd-cw-_muiV1vLmmF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzMgTYvVE3ubddgIRJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzOW2MPsRaHIRTFp8N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzJldARhqfUwG_rICB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw0jQn-k3rIehzP3z94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyWMRoBBP6WD3mzgz14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyIXpvRehDOuJsAiQd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwhRjEachs-kJ4LUVF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzatvAAZ2scKfnjun14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
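The lookup-by-ID view above amounts to parsing the raw model output as a JSON array and selecting the record whose `id` matches the comment. A minimal sketch, using two records copied from the response above (the function name `lookup_coding` is illustrative, not part of the tool):

```python
import json

# Raw model output: a JSON array of per-comment codings, as shown in the
# "Raw LLM Response" panel (two entries reproduced here for the example).
raw_response = """
[
 {"id":"ytc_Ugx4qxWJT1Lkb4w_ELF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzMgTYvVE3ubddgIRJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw LLM response and return the coding dict for one comment ID,
    or None if the model did not emit a record for that ID."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugx4qxWJT1Lkb4w_ELF4AaABAg")
print(coding["responsibility"], coding["emotion"])  # user resignation
```

Returning `None` for a missing ID (rather than raising) makes it easy to flag comments the model silently skipped, which matters when the batch response is shorter than the batch of inputs.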