Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
You should say “please” and “thank you” to llm’s because the companies behind th…
ytc_Ugw-_AMeq…
Ai doesn’t take references. It just replicates the art and just turns it into th…
ytr_UgwsiceYS…
This video is a case in point as to why I don't see chatbots accomplishing any w…
ytc_UgyLE9J43…
There should be a law that states if you use ai or replace staff with machines, …
ytc_UgwaJ93z-…
@daydreamer8373 Yes, FSD is impressive, but it's not L5 and it is not really "Fu…
ytr_UgwqiD-5J…
This is good conclusion I think. I refer to Copilot as a really good autocomplet…
ytc_Ugz0hwCta…
Honestly, trying to predict a human behavior without a human brain sounds more l…
ytc_UgwB_5Ljg…
They call him the godfather of AI because as a good practitioner of his culture …
ytc_UgwOY5kF1…
Comment
ITs already to late to really regulate anything pandoras box is already open a lot of this AI tech is now open sourced and easily available to anyone willing to try to install it on their own computers....AI is already on a fast paced advancing trajectory...some of the AI devs already realized that they could be replaced by AI to create even more advanced AI in just a few years.
youtube
AI Governance
2023-05-21T01:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOJkbQzlqUxTOUvCt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwIDJ4R9KcayQWB5zB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx8sm4DpNFzxdoKjTl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwybJRwzk7G4ejxWZd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzfO3cpLg_K9nvj7TB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyUUFO-pmHIWLrdWkZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgwO0HyMYxaVdZu5fBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxrVqRN9klIQKMhSkN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzmKH-P9-MfAYkrcG94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxIgrtlVqcDLH5-P6t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
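A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed-value sets are inferred from the sample output shown here, not from an official codebook, and the function name `parse_coding_response` is illustrative.

```python
import json

# Assumed vocabularies, inferred from the sample response above
# (not an official codebook for this project).
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    reject any record with an out-of-vocabulary dimension value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with one record in the same shape as the raw response:
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
records = parse_coding_response(raw)
```

Validating against a fixed vocabulary catches the most common failure mode of LLM coders: a near-miss label (e.g. `"regulation"` instead of `"regulate"`) that would otherwise fragment the category counts downstream.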