Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.

Random samples
- `ytc_UgzJlwQK-…` — "The fatal flaw in Tesla's autopilot is that a computer is just a software algori…"
- `ytc_Ugzn0JTYh…` — "All those outrageous statements about Ai taking over mankind are human-made inpu…"
- `ytc_Ugw3IuEVU…` — "AI relationships are reflection of bad society, and the need for these kind of r…"
- `ytc_UgyLGAxBl…` — "id argue that the driverless /would/ be able to stop in time. they may be rear-…"
- `ytc_UgxwY_ooJ…` — "There should've been the opposite rule,, say plum if you can't say no. Also if y…"
- `ytc_UgwX70FJv…` — "The problem isn't if AI will decide itself to do harm. We already know in the ha…"
- `ytr_UgiW6uLvb…` — "DeviantDespot no it is not. This is clearly a marketing ploy. Real self-aware AI…"
- `ytc_UgwH_5oQP…` — "I truly think you're way too trusting of Tesla. They ARE death traps. The auto p…"
Comment

> LOOOL so the sponsor reads in this video are basically pointless as AI is better at even the stuff he's pushing. I work with AI and you can now even ask it to build a photoshop or other similar tool in a web language and run it either in the AI window or upload it in your own server. Even software is pointless because you can just ask for what tool you need and it can make it for you.

youtube · AI Governance · 2025-09-07T05:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzSZfLnyk-TwrRJhop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCI46_ZgY9VLM33YZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwAzaCyEhMNf_NTQ6d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz16Sw7YvB3Zgyyczt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzslxt-uMq7kQ9e6eF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzp8okus-NqROF3EHZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKa3jjoMgHfBzj22Z4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxp11JpCuiX_WQ7p5d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWK-v44Dzmg_gOb6p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwA9rYW3lHufxqYWpJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
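Looking up a comment's coding by ID amounts to parsing the model's JSON array and scanning for a matching `id`. A minimal sketch — the helper name and the shortened sample data are hypothetical, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) follow the raw response above:

```python
import json

# A shortened stand-in for the raw model response above: a JSON array
# where each element codes one comment on four dimensions.
raw_response = """
[
  {"id": "ytc_UgzSZfLnyk-TwrRJhop4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzslxt-uMq7kQ9e6eF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for a comment ID, or None if absent."""
    for coding in json.loads(raw):
        if coding.get("id") == comment_id:
            return coding
    return None

coding = lookup_coding(raw_response, "ytc_Ugzslxt-uMq7kQ9e6eF4AaABAg")
print(coding["emotion"])  # approval
```

In practice the raw response would be stored alongside each coded comment, so the same scan also serves the "random samples" view: pick an entry at random and render its four dimensions as in the Coding Result table above.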