Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "i think Mayweather can defeat that robot, just keep on running untill the batter…" (ytc_UgyGjlKJX…)
- "That’s true AI. We haven’t seen true AI yet. I don’t if we ever will.…" (ytr_UgxsVnzn_…)
- "Did you watch the video at all? AI is racist because it's using data from humans…" (ytr_Ugwhhxep-…)
- "listen all the things accept by sendsers of robot Sophia she stored that in her …" (ytr_UgwnJDGMq…)
- "2:55 need an example please.(edit: halting problem can not decide to stop or ru…" (ytc_UgznwOiEN…)
- "Honestly the amount of people using ChatGPT and blindly trusting its efficacy at…" (rdc_nk6i9io)
- "what i said sometime ago. https://www.reddit.com/r/cscareerquestions/comments/1…" (rdc_n3kz7yt)
- "Ai's have a certain context limit so some of the information they give it comple…" (ytc_UgxkVdA_Y…)
Comment
Training AI and training humans is identical in my opinion. At least it is very similar. I read a book, I hear a song, I eavesdrop on a conversation, listen to a comedy routine on tv or consume a free YouTube video. It's all input. Then my brain knows what it knows because I have experienced all these inputs. Now, do I have the license all those imports and pay a fee for everything I hear and see? If the data is premium, if it's privileged, precious or secret, don't release it. Stealing is wrong but learning from someone else should not be compared to theft. I will not going to delve into copyright issues, I am merely thinking about learning. Whether it's human or AI the process is similar.
youtube
2025-03-20T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzdy794pLSA6ZrYPNh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxMfr8NriTyWry8ESN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyBKKb9IDB_ScNTovV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzDRX2yBI5GE5OvVlR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwG5oPB0sIt66wDZZN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz0Bsqgt4oBSyn3Fod4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw7PkT2xsLkACWqICJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzyg0aJWpWBS9nyj9F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxuiRtF55SfydJHwhd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJ6xUaxhrKtnnuEhh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
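The raw response is a JSON array with one object per comment, keyed by comment ID and carrying the four coded dimensions from the table above. A minimal sketch of the "look up by comment ID" workflow, using a subset of the records shown (the exact parsing code in the tool may differ):

```python
import json
from collections import Counter

# A subset of the raw LLM response above (same schema: id plus four dimensions).
raw = """[
  {"id": "ytc_Ugzdy794pLSA6ZrYPNh4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzDRX2yBI5GE5OvVlR4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz0Bsqgt4oBSyn3Fod4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

records = json.loads(raw)

# Index the batch by comment ID so any coded comment can be inspected directly.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_UgzDRX2yBI5GE5OvVlR4AaABAg"]["policy"])  # regulate

# Tally one dimension across the batch.
print(Counter(r["responsibility"] for r in records))
```

The same index generalizes to mixed sources, since Reddit IDs (`rdc_…`) and YouTube IDs (`ytc_…`/`ytr_…`) share the flat key space shown in the samples.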