Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_da468d9`: "This is a quick extracted table of the overall results of the CCPI 2017: Key: '…"
- `ytc_UgzuIy3mQ…`: "Surely 5000 people dead by auto cars is too many for the amount of self driving …"
- `ytc_UgxNETRSX…`: "I'm a truck driver. Looking at what the current state of AI is with regular traf…"
- `ytc_UgwL_5T03…`: "What is even going to happen if we get to a zero-growth environment because no o…"
- `ytc_UgwDH4ZCO…`: "AI and automation will be short lived for one simple reason. Products require up…"
- `ytc_UgxsfgTK7…`: "Im afraid of AI because I don't want to be put in a hover chair because the Ai m…"
- `ytc_UgxPd14PG…`: "With AI replacing Jobs, there will be no need for the workers. Rich only feed yo…"
- `rdc_i2vtte0`: "If AI males a bad decision and soneone dies, no one has to take responsibility f…"
Comment

> I wonder the exact interface / UX / UI & type of training required to effectively leverage & navigate neuralink 😮…
> Sounds like neuralink will be dependent upon individual thought process… making customization highly imperative & controls not so universal maybe?
> AI: "Your premise is correct: any application using a brain-computer interface (BCI) like Neuralink (especially for creative output), would depend heavily on the individual's thought process. This would make customization not just important, but essential, while universal controls would be difficult to achieve."
> Cyborg sounds pretty cool 😎 😂

youtube · AI Governance · 2025-09-26T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyeVqR8euCstj73d8R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxEXtU6Afktx5FIlL94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyBQ4JXvn8nZPW_1Uh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwR6_koRWF4pHhGO294AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzLVJ30CMYeCONSad14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwbh2h-70XRPQQLrHR4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgywOQNT6fwhAmvZlEx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "disapproval"},
  {"id": "ytc_UgzI8G12uL_DdiXXH2J4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgweXdT2YJ_k5uOctdp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz3SOtOxG3kzYEZJPZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
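The raw response is a JSON array of per-comment codes, each keyed by a comment `id`. A minimal sketch of how a look-up like the one above could work, assuming only the standard `json` module and using two entries copied from the response (not the tool's actual implementation):

```python
import json

# Sketch: parse a raw LLM coding response (a JSON array of
# per-comment code objects) and index it by comment ID.
raw_response = """
[
  {"id": "ytc_UgyeVqR8euCstj73d8R4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxEXtU6Afktx5FIlL94AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]
"""

# Build an ID -> codes index for O(1) look-up by comment ID.
codes = {entry["id"]: entry for entry in json.loads(raw_response)}

# Retrieve the coded dimensions for one comment.
row = codes["ytc_UgyeVqR8euCstj73d8R4AaABAg"]
print(row["responsibility"], row["emotion"])  # none indifference
```

Indexing by `id` rather than scanning the array each time is what makes "look up by comment ID" cheap when a batch contains many coded comments.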