Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_Ugx99bU6X…`: "This is stupid. I asked ChatGPT about my coworker and it responded with basicall…"
- `ytc_Ugy_2jexP…`: "U mean all change to these job but the end if all can make it also become supply…"
- `ytc_UgytAjFWM…`: "All I've ever fantasized about was a sleek Asian boyfriend who wore turtlenecks …"
- `rdc_odieenq`: "Nice to see the NYT take appropriate action. To save you a click, it looks like …"
- `ytr_UgxVm54_l…`: "@JohnSmith-x3y8h Tesla doesn't provide any report on the accidents because of th…"
- `ytc_UgzxfDZ1H…`: "the 2.4k people who disliked this use ai to generate false art, prove me wrong…"
- `rdc_od412hi`: "Lol. Wonder if this their way of having the memory cartel save face after the…"
- `ytc_UgzCXyoSU…`: "(chuckling) Well... we can stall all we want *now*, and pretend (to each other,…"
Comment
I really don't see why developing advanced AI would be a problem in itself. However, capitalism and robot workers don't go well together. Robot workers will cause a lot of unemployment for sure, probably so much, in fact, that a human workforce will no longer be needed at all (or close to it). I think that at that point it is time to redefine how the whole society works. Basically, you shouldn't have to work to survive anymore. Since a human workforce isn't needed anymore, and there aren't enough jobs for everyone, why force people to work? Let people have free food, free homes, free devices, and get rid of the concept of money. When people no longer have to work at a shitty job they hate, but rather use their time to do what they find interesting, I think the results will be much greater!
youtube
2013-12-05T22:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugj8AXUuhgfjUXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghEfYIiBlCtyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg6pJ8sg8sIuXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh4Izu1dFDCBngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggStT0fkttiU3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjHME_FVR-RjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgiFPP6fP-f4CXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjk-OLPfqT00HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugj_Nwoh-nEukngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggdtWoUYVl_S3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
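Because the raw response is a JSON array keyed by comment ID, matching a code back to its comment is a simple parse-and-index step. A minimal sketch (the `index_codes` helper name is illustrative; the field names match the response above):

```python
import json

# A raw batch response in the shape shown above: a JSON array of
# per-comment codes with the four dimensions from the coding result table.
raw_response = """
[
  {"id": "ytc_UgjHME_FVR-RjHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugh4Izu1dFDCBngCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_UgjHME_FVR-RjHgCoAEC"]["policy"])   # regulate
print(codes["ytc_Ugh4Izu1dFDCBngCoAEC"]["emotion"])  # fear
```

The same index then serves both views of the page: the per-comment "Coding Result" table is one record from the dict, and the "Raw LLM Response" panel is the unparsed array it came from.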