Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "That ai generated statue of justice looks like crap. What is the matter with her…" (ytc_UgyjSYfTe…)
- "I would like to become an airline pilot in the future, is it possible that all a…" (ytc_UgxxYiOd0…)
- "Watch them try to hide AI / It will fail \"you\"(imagined collective) / Good luck / M…" (ytc_UgyIW6-rU…)
- "There are so many what ifs in this video making it a very philosophical question…" (ytc_UgifpkGSn…)
- "BUT ABOVE 44% OF INDIAN FARMERS HAVE VERY SMALL LAND AND THEY CANNOT AFFORD THE…" (ytc_UgwbFz0jy…)
- "Been treating the AI with kindness and respect for the past few years. Just noti…" (ytc_UgydKvG6o…)
- "Arguing with someone who thinks ai art is better genuinely feels like talking to…" (ytc_UgwRkFSYG…)
- "Hi, I've compiled the following ethical arguments on self driving vehicles, woul…" (ytc_UgjTLCUkX…)
Comment
It always disappoints me when a person in a position like Sam's, answers "what will people be doing in X years when everything is solved?" and the answer is along the lines of "being even more competitive and selfish, and looking to own more material things than ever before...GALAXYS!" and not "more empathetic, cooperative, constructive, curious, and not needing to consume to feel good about ones self..." But what should I expect from a VC. 🤷 I mean, really, what is the point of solving scarcity if we're just going to have to hop on a larger hamster wheel for the whims of the wealthiest among humans? Silicon Valley's vision for our future seems rather grim to me.
Source: youtube · AI Responsibility · 2024-11-21T00:4… · ♥ 21
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_CC2ZJ5dHoXNyF1d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyvUWentEPCGOQeg2t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugznpo1qO5UhrOcnB714AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzlypR29txtV9YbWs14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disappointment"},
{"id":"ytc_UgwM9WEjYL87G7DVUXl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyQi8P1AJnhVo4qH3d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyJOLcLm2Kbo8aoQuF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwZaSLO5KcjNOQo8tJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-UAMTxzfLxYk6XVp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzxV_9f3Hagp_aqE4p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"regulate","emotion":"mixed"}
]
```
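A raw response like the one above has to be parsed and checked before the codings can be trusted. Below is a minimal sketch of that step; the allowed vocabularies in `ALLOWED` are an assumption inferred from the values visible in the coded examples and the result table, not a confirmed schema, and `parse_codings` is a hypothetical helper name.

```python
import json

# Assumed dimension vocabularies, inferred from the sample codings above.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "mixed", "approval", "disappointment", "indifference", "fear", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment id and a known value for each dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records with an out-of-vocabulary value (a common LLM failure mode) are silently dropped here; a production version would more likely log them and re-queue the affected comments for recoding.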