Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up by ID, or selected from the random samples below.

Random samples
- `rdc_j6ebdmn`: "Not necessarily. Lots of people make their code open source. Guess they would te…"
- `ytc_Ugw3W3sc7…`: "It’s crazy had this idea in the 1st year of high school. Teacher talks 40 mins,…"
- `ytc_Ugz_BN77i…`: "I have another reason in mind, but I'd need a neuroscientist or a psychiatrist t…"
- `ytc_UgxSa53nm…`: "I find it really funny that using ai art as reference can lead to mistakes that…"
- `ytc_Ugy6bbe1N…`: "All AI talk is nonsense. Bring engineers on the table, back end developers. Ai d…"
- `ytc_Ugxi1X0KT…`: "AI is dumbing down humanity not to mention that it will be used for nefarious pu…"
- `ytc_UgzoFp1Hy…`: "I was interested in this guy until he said he wanted to live forever. People lik…"
- `ytc_UgyKYDQBx…`: "Where is the carrot? If humans loose the power of labor what power will they hav…"
Comment

> Theoretically, AI could be used to achieve incredible things and be a massive help to all of humanity. In the real world, billion dollar companies are going to use AI to become trillion dollar companies while eliminating all of the jobs that aren't managerial. There's gonna have to be a universal basic income or anyone who isn't the child of a politician or a millionaire is going to be boned.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Jobs |
| Posted | 2025-10-09T10:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxnZFS_ZROH31AYPDJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy3n1yFpbp0IxEFikB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyUZmSlmm78aEM6J6N4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwpjcICj2EpkXGJrJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzfHJCabL18Kq8vlsR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzQnBLHWmiO0CzBuu14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgypdW-KN3ZtzLDyWll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxUdIF7kmVJqK11Mvp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwf0RiXvPVgt2sGkYt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw4xPsXbW6kUs2BIgZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
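The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how a coding could be looked up from such a response, assuming the response parses as valid JSON in the shape shown above; `lookup_coding` is a hypothetical helper for illustration, not part of the tool:

```python
import json

# Example raw batch response, using the field names shown on this page:
# id, responsibility, reasoning, policy, emotion (two rows copied from above).
raw_response = """
[
 {"id":"ytc_UgwpjcICj2EpkXGJrJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzQnBLHWmiO0CzBuu14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            return row
    return None  # comment not present in this batch

coding = lookup_coding(raw_response, "ytc_UgwpjcICj2EpkXGJrJp4AaABAg")
print(coding["emotion"])  # fear
```

In practice the model's output may not be clean JSON (stray prose, trailing commas), so a real pipeline would wrap `json.loads` in error handling before indexing by ID.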