Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- He said that he left Google so that he could tell the truth about AI. (In anothe… (`ytr_UgwQvO7SY…`)
- Shocking lack of tact, decency, empathy, compassion, respect for the dead, class… (`ytc_UgwUbwdDg…`)
- But there are still elections in a democracy so I would expect people to vote pr… (`ytc_Ugzos0qL_…`)
- What is feeling anyway? Simply an alert system wired into neurotransmitters to e… (`ytc_UgxXSl6Zf…`)
- not pro ai, but yall do know how burning it is to learn how to create art right.… (`ytc_Ugw3igaoG…`)
- Most of our current SE1s should have been promoted to SE2s months ago but they'r… (`rdc_oagmn2q`)
- It's not the ethics of AI that's the problem here, it's the copyright enforcemen… (`ytc_Ugy75lrWY…`)
- I do food deliveries in Sydney for a Chinese company. They have been trying to … (`ytc_Ugztkihi9…`)
Comment

> real AI is a bit away, we only have predictive and learning AI rn. any "AI" rn if you connect it to important infrastructure and military with directives can cause harm and seem evil, but no its just how they were coded. NO AI RN can actually think on its own they are still just algorithms

Source: youtube · Posted: 2024-12-14T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyKKiwE_kL19rfECoJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwGn-hT3gMy0Z117cl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyXdnfZtrDPtOUnYw54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz13kH4sL1CAeQSUGV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzx0ght2MWCFOokVIJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzANJgrF-Dsb5YZxoZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyN9Bu3fDBgmCSs1tN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwmeAPhfyKAMNMX98h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzFqF5DcAFTL70PiiV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx1ZlTVsPMIyiK380t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```