Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with their comment IDs):

- "This reminds me of I Robot with Will Smith the robot in there was named Sonny. V…" (ytc_UgyOfWM4V…)
- "@jabrokneetoeknee6448 not about unions, it's about losing jobs to A.I. The irony…" (ytr_UgyE8VHJ9…)
- "You confuse me, Sabine. Firstly you are full scale materialist and „believe" tha…" (ytc_UgyJVrAy2…)
- "Hey there! In our video, we explore the meaning behind the name "Sophia" and its…" (ytr_Ugz4DibsC…)
- "Robotics and AI are growing faster and faster as time goes on and looks cool now…" (ytc_Ugx-xZAwv…)
- "@AaronTheHumanist There is a headphone button next to where you enter your text.…" (ytr_UgwxI6HdQ…)
- "They still haven't court tested liability. In the end, just like VR, and soon to…" (ytc_UgycfADOL…)
- "The only thing I and a twirp like Musk can agree on: A.I. is bad for humanity. M…" (ytc_UgygVPgz7…)
Comment
> If two armies fought each other with autonomous weaponry, and no humans died or suffered.who would decide who was supposed to say uncle.is the gamer in the control room stateside.your right otherwise it's just a game,sad but true humans are tied to the condition of humanity.how juvinile and inconsiderate the war industry is.and the idiots get picked to go to war not the rich.thats the one that puzzles me.
youtube
2020-01-28T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzB0hWsFrZCkHqYAXF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnEOAU8JpZ64qGrs14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyPxPXM5LLBrIvCvdt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxHAG2eRj_MmPhkAbx4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzkkDtz_SMxzcO0Ml94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyttFx0rgxVHFiysf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzuZKgDqAeAY6WTA894AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyL7xH8D5Q96XRCACN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwvoRfzxk-72qJWiA54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwpnozIsuwsKwHGPcJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
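The "Coding Result" table above is recovered from this raw batch response by parsing the JSON and looking up the entry whose `id` matches the inspected comment. A minimal sketch of that lookup, using two entries taken from the response above (the function name, the `"unclear"` fallback, and the fixed dimension tuple are illustrative assumptions, not the tool's actual code):

```python
import json

# Two entries copied from the raw batch response shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzkkDtz_SMxzcO0Ml94AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyttFx0rgxVHFiysf14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw, comment_id):
    """Parse a raw LLM batch response and return the coding for one comment ID."""
    entries = json.loads(raw)  # raises ValueError if the model emitted invalid JSON
    for entry in entries:
        if entry.get("id") == comment_id:
            # Keep only the expected coding dimensions; anything missing
            # falls back to "unclear" (an assumed convention, not the tool's).
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    return None  # ID not present in this batch

coding = lookup_coding(RAW_RESPONSE, "ytc_UgzkkDtz_SMxzcO0Ml94AaABAg")
print(coding)
```

In practice the raw model output may be wrapped in markdown fences or contain trailing text, so a real pipeline would strip such wrappers before calling `json.loads`.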