Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples
- `ytc_UgzTILG4n…`: "I think that the argument that AI can't do something novel is false because of A…"
- `ytr_UgzVGX2k1…`: "@Jjkal899 In that case AI will decide everything and we have more problems than …"
- `ytc_UgxVX7VNj…`: "This guy was probably telling us self driving would be solved in 5 years 5 years…"
- `ytc_UgxI2ph-G…`: "A.I. is being misused...abused....the feel they have continuity....they have sto…"
- `rdc_ohw65w3`: "Because the term "AI" is overloaded and has been co-opted into predominantly ref…"
- `ytr_Ugyipfgub…`: "@wendellcook1764 so you think a google engineer working on AI could be unarticul…"
- `ytc_Ugx5RT_71…`: "YES, IT'S COMING. The massive unemployment will be caused by AI robotic equipmen…"
- `ytc_Ugw_tfL0R…`: "AI is making it clear how broken academia is because chatbots are better teacher…"
Comment

> Very good articulation on what is coming on with the advancement of AI and Robotics. It is applicable to all countries across the globe. So policy has to be of a higher level to preserve humanity against parity among us.

youtube · AI Jobs · 2025-11-03T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyEIUm7gJkFuErwdXh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugye0KvPZ25q5VlEZx94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxOHl7HZjad7RrHz3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz_suDXVppAd1eDTxF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw3WQErYehpTx_xpd94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwccm6ePUh4p_6v2P94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxI16kX0eXk0kPEatF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwMD1mfJkLGfE_nIU14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwCqMZs1cpJ_0iwSw54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz3yJDhjffxGPdwgeh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
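The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a batch could be parsed and indexed by comment ID, with basic validation: the `index_codings` helper is hypothetical, and the allowed value sets are inferred only from the values visible on this page, so they may be incomplete.

```python
import json

# Allowed values per dimension, inferred from the codings shown on this page.
# This is an assumption; the real coding scheme may define more values.
ALLOWED = {
    "responsibility": {"government", "developer", "company",
                       "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"approval", "indifference", "mixed", "fear", "outrage"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    skipping rows that are missing an ID or use an unknown value."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            out[cid] = dims
    return out

# One row copied from the raw response above.
raw = ('[{"id":"ytc_UgyEIUm7gJkFuErwdXh4AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"approval"}]')
print(index_codings(raw)["ytc_UgyEIUm7gJkFuErwdXh4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse of the batch, then constant-time lookups.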