Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Real people, it would be harder to make a robot with all that unnecessarily thic…" (`ytc_UgyllxrpE…`)
- "As a scientist, I decided to implement chatGPT to assess knowledge on a given to…" (`ytc_UgyExRoVT…`)
- "When people say "AI is going to replace us" they tend to picture scenes out of '…" (`ytc_UgzJrmiOT…`)
- "Humans species is always skeptical , when cashiers used scanners in grocery stor…" (`ytc_Ugy8qS3tS…`)
- "capitalism lives by innovation. if there is no innovation people dont need to pr…" (`ytc_Ugxgke9ak…`)
- "It looks like Sophia might just be displaying some of those lifelike features th…" (`ytr_UgzNzbiky…`)
- "Interesting topic. This is a problem we don't have to deal with just yet (and p…" (`rdc_iciv8wy`)
- ""Art is one of the backbones of our humanity." So to make a truly human AI, havi…" (`ytc_UgwHBYZzv…`)
Comment

> In the future companies will have strict robot owning laws and every people will have the right to at least one robot per person and companies will then rent this right for the use of their one robot per person right, basically it would be like every body has one robot working for them that even if they don´t own one they can still rent out the right to one...anyway this is my prediction of a future solution to one of the problems this video addresses.

Source: youtube, posted 2013-06-26T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugwf8OHi6HLTfI3l9Tl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy2wim4Us7Hq2JT5fh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzs9gC2tJDC-stZdFx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvDTGSeY2ddK3gNQp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzwZ3jfrMIcqMn2plR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyJOLbkO5X0n5YcUn54AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxGE5rP2don4fja5Yd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsC4kTTP1wwi9SEFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwc6bUu3Lj4NXzfPAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwqA6jr9GQVtv3byz14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
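The raw response is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be indexed to support the comment-ID lookup described above — assuming the model output parses cleanly as JSON; the `index_codes` helper and the abbreviated two-record sample are illustrative, not part of the tool itself:

```python
import json

# Hypothetical abbreviated response in the same shape as the sample above.
raw_response = """[
  {"id": "ytc_UgyJOLbkO5X0n5YcUn54AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwqA6jr9GQVtv3byz14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

def index_codes(raw: str) -> dict:
    """Parse a raw JSON coding response and build a {comment_id: codes} lookup."""
    records = json.loads(raw)
    # Drop the "id" key from each record so the value holds only the codes.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes_by_id = index_codes(raw_response)
print(codes_by_id["ytc_UgyJOLbkO5X0n5YcUn54AaABAg"]["policy"])  # → regulate
```

A real pipeline would also need to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`), since nothing guarantees an LLM returns valid JSON.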