Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I undestand the concerns, but I do not think A.I. is the problem in this situation. By using sites like instagram, the user agrees to their tos which includes the sharing of data, which then research companies like midjourney or other similar companies buy and use that data in order to train the A.I. In other words, I dont think A.I. and A.I. companies are the ones who should be prosecuted, rather sites with tos which allow and are the reason the artwork of many artists are found in the A.I.s training database. Matt vid pro AI made more interesting and fuller arguaments on the topic: https://www.youtube.com/watch?v=KmYmbuL3Sbs
| Field | Value |
|---|---|
| Platform | youtube |
| Video | Viral AI Reaction |
| Posted | 2022-12-30T14:2… |
| Likes | 2 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxLnualKCjO-1gr9a14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzdbOXGHDVvSSOzpB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyhMnPAE1eujhhjCgN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw_2VNcEZ1I4nTOWpF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzaNreld6dkA8tr4dh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxjnVXcFE86M2s1xSt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwLCwhrbmuVTbOBlM14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyOmPh39sx4VQqQB694AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwtReb48N2f71SeSvR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxkS4DoqSe9N9EMhQ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```