Raw LLM Responses
Inspect the exact model output for any coded comment. Look a comment up directly by its ID, or pick one of the random samples below; a sketch of the lookup logic follows the sample list.

Random samples
- `ytc_UgzL2DSk8…`: "I hate this guy, Hinton. He knew 10 years ago, when he was at Google, the negati…"
- `ytr_UgzOauwtf…`: "@laurentiuvladutmanea The only EU regulations currently in the works are those p…"
- `ytc_Ugzs-9qbT…`: "If “they” (AI) will become dangerous I guess we will have to just unplug them 😂😅…"
- `ytc_UgwOqZeTl…`: "Build your own sex doll for the sickos in the world. Do they come in two parts o…"
- `ytc_UgwJ4Ghnp…`: "You don't remember phone number, or maps or recipes for how to cook a certain di…"
- `ytc_Ugzif4kR8…`: "Things like Arcane is an example of this. There is so much meaning, symbolizing,…"
- `ytc_UgywVL7zm…`: "What is the purpose for coorporations replacing human jobs with AI/Androids, if …"
- `ytc_UgwWRG-ql…`: "come on , this is so stupid. Who really believe that world would change in 4 yea…"
Comment
I don't think you were too harsh. Get lazy get wrecked. However, the timeline for when human lawyers become backseat drivers to A.I. and then useless, is likely much shorter than most think. I'd say at current dev rate, maybe 2031ish and no later than 2033. We just can't admit to ourselves how unprepared we are when trying to understand what an exponential tech growth timeline means and so when we look at one, most will disbelieve their own eye's.
youtube · AI Responsibility · 2023-06-13T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzxlPhJIzB2AZMmzwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylzcARabn1atr6c_V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy4ro_TENP-3-AM0h94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymSjHeZYZmY9ypNZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyRqKHnQjSlgHj1kjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0y7evlrGpDMzBNCh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyLmvFFjkeTQY0pgA14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzqqvUubefMdeFIz7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwCad6OCQKdQ_-RTGZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx35aEuMDdZHHDd_ph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
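To make the response schema concrete, the sketch below parses a raw response like the one above and checks each record against the label sets that appear on this page. The `ALLOWED` sets are inferred from this single response, so the actual codebook may permit more values per dimension.

```python
import json

# Label sets inferred from the single response above; the real codebook
# may allow more values per dimension than appear in this one output.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any unexpected labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                print(f"{rec.get('id', '?')}: unexpected {dim}={value!r}")
    return records

raw = (
    '[{"id":"ytc_UgylzcARabn1atr6c_V4AaABAg","responsibility":"user",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
records = parse_response(raw)
print(f"parsed {len(records)} coded record(s)")
```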