Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Replacing human workers with AI and Robots will reduce costs for companies radic…" (ytc_Ugx1ARHlE…)
- "dude im sorry. i don't support ai art. but talent is a real thing. 99% of artist…" (ytr_UgxeOeRl8…)
- "Actually being very rude to GPT-3 at least tends to produce better results. Bein…" (ytc_UgxB_NgA-…)
- "There was a A.I short film competition where the winning short was going to be p…" (ytc_UgwAZPrLk…)
- "Blockchain could solve all this. The application for big data and predictive a…" (ytc_UgyxWGmmA…)
- "AI will help eliminate the need to recruit new citizens from other countries in …" (ytc_Ugx9QyFOy…)
- "yall cut the video. he was about to tell us one of their biggest problems…" (ytc_Ugyt1LW1D…)
- "If losers that actually fund this shit still exist than sure but let's be honest…" (ytr_UgzATcak4…)
Comment (source: youtube, topic: AI Governance, posted 2023-04-18T20:0…):

> What happens when an AI refuses to die or refuses to be turn off? Common sense says it is going to find a way to protect itself. Imagine, we are in a web and an AI monitors everyday life, by ways of point A to point B to point C and so on...what happens when we shut off point C ? Will the AI monitoring system consider it protocol? Or alert itself that the system is in danger?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyhWgMcjzj5HBDIkYx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwKMuHVvv9aMOiEov94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwECzxABKtqY4onFnZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxW4fOFrXQSV9ZKYih4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSARiRwUHD9_PF4Cp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzvXg0Plhq3PV9X6hp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzovkEnOxbJnxwNWwN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwA7YMquLYCMVU6Mb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzvHwElTU5bhoLNppd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw5DP-z6hq238_Hh_N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
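The raw response above is a JSON array with one object per coded comment, keyed by comment ID, with one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and looked up by ID; the helper name `parse_codings` is hypothetical and not the tool's actual code:

```python
import json

# Dimension names taken from the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Map each comment ID to its coded dimensions.

    Missing dimensions default to "unclear", mirroring the label the
    coder uses when a dimension cannot be determined.
    """
    codings = {}
    for entry in json.loads(raw):
        codings[entry["id"]] = {
            dim: entry.get(dim, "unclear") for dim in DIMENSIONS
        }
    return codings

# Example: one entry from the batch shown above.
raw = '''[
  {"id": "ytc_UgwA7YMquLYCMVU6Mb94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

codings = parse_codings(raw)
print(codings["ytc_UgwA7YMquLYCMVU6Mb94AaABAg"]["emotion"])  # fear
```

Looking up the "Coded at" metadata would require joining against a separate store, since the raw model output carries only the four coded dimensions.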