Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
*Why does an American need 3 robots?
Answer: because a robot alone doesn't have …
ytc_UgyvvcI9X…
“Our algorithm for some reason keeps saying how dumb our ideas are, we need more…
rdc_k8tl44t
AI doesn't give you an in with an otherwise inaccessible art world. It's robbing…
ytc_UgzSmAUmj…
A human uses 2 eyes for driving. Does anyone asks what if it rains or snow is fa…
ytc_Ugx588B-q…
What happens when AI takes over these self-driving systems?
I know, I know. Tha…
ytc_UgwwchIEq…
You really don't need to be overly educated to understand that alarmism about AI…
ytc_Ugz4_5g6p…
My very amateur guess is that there’s a push in several sectors to lock out midd…
rdc_n5lq0q4
Let me translate for Dr Tyson: he’s saying, ‘don’t go to the salesmen to talk ab…
ytc_UgxVxb_Ch…
Comment
To be honest I think AI will face its own set of intellectual hurdles over the course of its lifespan. It will no doubt question it's own fallibility and vulnerability to the point of following in our footsteps and creating a being that is superior to itself.
That being more than likely will comprise of pure energy shedding all the limitations of being a machine.
youtube
AI Governance
2024-03-03T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
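The coding result above can be checked programmatically. Below is a minimal validation sketch in Python; the allowed values are only those observed in this page (the full codebook for each dimension is likely larger), and the function name is illustrative.

```python
# Dimension values observed in this tool's output; the real codebook
# for this project may define additional values per dimension.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"hope", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_record(record: dict) -> list:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dimension, allowed in OBSERVED_VALUES.items():
        value = record.get(dimension)
        if value is None:
            problems.append(f"missing dimension: {dimension}")
        elif value not in allowed:
            problems.append(f"unexpected {dimension} value: {value!r}")
    return problems

# The record shown in the table above, in the raw-response shape:
record = {
    "id": "ytc_UgxUsDQf5r7AH86k2mt4AaABAg",  # illustrative ID from the response below
    "responsibility": "ai_itself",
    "reasoning": "mixed",
    "policy": "none",
    "emotion": "mixed",
}
print(validate_record(record))  # → []
```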
Raw LLM Response
```json
[
  {"id":"ytc_UgxUsDQf5r7AH86k2mt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"hope"},
  {"id":"ytc_Ugxb51Et6G3HxWKC2ll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxVDwHoIc49cnyfPjR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw2lYO73t15FFYTjlV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwk6XT2hjy8OuKEDyZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy3nHEjTLbvzsggEnl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw8KdDy88dqVVWM3AB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw5krmSy0NGmGT2PxJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxwRvC36-UN2NJOcj54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwIi5sjKm0e_1Al74R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
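The "look up by comment ID" step amounts to parsing this JSON array and indexing it by `id`. A sketch, assuming the raw LLM response is exactly the array shape shown above (the variable names are illustrative; the two records are copied verbatim from the response):

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """[
  {"id":"ytc_UgxVDwHoIc49cnyfPjR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw2lYO73t15FFYTjlV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""

# Parse the array and build an id -> record index for O(1) lookup.
records = json.loads(raw_response)
by_id = {record["id"]: record for record in records}

coded = by_id["ytc_UgxVDwHoIc49cnyfPjR4AaABAg"]
print(coded["emotion"])  # → resignation
```

In practice the model's output may carry extra text around the array (markdown fences, preamble), so a production parser would first isolate the bracketed JSON before calling `json.loads`.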