# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.

## Random samples
- `ytc_UgwciUdxS…`: "This is natural evolution. Humans have evolved to a point where they have create…"
- `ytc_UgyiaMWTc…`: "If no laws are put into place to balance between AI work and human survival, the…"
- `ytc_UgyAlRost…`: "AI in a nutshell, non-talented HACKS who can't stick figure have a tool to steal…"
- `ytc_Ugx2Mv8Tu…`: "Autopilot is not FSD, these people who buy non-FSD cars blaming self driving mak…"
- `ytc_UgyhBq8k7…`: "Bro I always have to check the channel name to see if it’s real not an ai one 😭😭…"
- `ytc_UgzIFhP0z…`: "I'm not an artist, at least not the visual kind (I don't draw, paint or even pla…"
- `rdc_d7l0qtr`: "Ironically that game was meant to be a demonstration of how messed up the system…"
- `ytr_Ugy3h2VUa…`: "It’s common knowledge, the snitch was snitching how ChatGPT was breaking copyrig…"
## Comment
Electronics have been using the medium of human civilization to evolve. It is only a matter of time before an electronic brain that is equal to the human brain is created. At that point, will humans be necessary any longer? Will we become the evolutionary appendix to a new cybernetic race? This is the biggest ethical question to developing AI and the answer is
Source: youtube · Posted: 2013-07-20T23:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgxlL_28gaMwstTd8914AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyteB37JUW16D4otX94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzs0ZKA6F3tVy8PRdZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxn600cM46SM9ffmi54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzRyVQ_Lm9QwHC1xv14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz77MDf6TtyRgXXxxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzWOPuOLhm0rk6C9H14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzeg6IeSjSqAqUhxZp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyQSMDJ-ikyzFweloB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxmKC9CPGp_HSMeIDF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
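A raw batch response like the one above has to be parsed and sanity-checked before the codings are stored. Below is a minimal sketch of such a validation step, assuming the codebook is limited to the category values actually visible on this page (the real codebook may allow more values, and `validate_batch` is a hypothetical helper name, not part of any existing pipeline):

```python
import json

# Allowed values per dimension, inferred only from the responses shown on
# this page; the actual codebook may define additional categories (assumption).
CODEBOOK = {
    "responsibility": {"none", "developer", "distributed"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed codings."""
    items = json.loads(raw)
    valid = []
    for item in items:
        if not isinstance(item, dict) or "id" not in item:
            continue  # skip entries with no comment ID
        # Keep the entry only if every dimension holds a known codebook value.
        if all(item.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(item)
    return valid

raw = '[{"id":"ytc_X","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}]'
print(len(validate_batch(raw)))  # 1
```

Dropping malformed entries rather than raising keeps one bad item from discarding the whole batch; a stricter pipeline might instead log rejects for manual re-coding.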