Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "While I appreciate Dr. Yampolskiy's expertise, I think these '99% of jobs gone b…" (`ytc_Ugw6ekVkC…`)
- "AI is fine. It is evolution. The common folk don't understand AI. I program AI. …" (`ytc_UgxO5Jww_…`)
- "@ I agree that they aren't, but the models most people here are aware of are at …" (`ytr_UgymTdHJs…`)
- "The presenter approaches this topic like a typical TikTok-professor: confident d…" (`ytc_UgzjcA95b…`)
- "@Spookatz. A common theme I’ve been seeing in comments is some variation of “I w…" (`ytr_UgxSg4HUj…`)
- "Stop with this bullshit. We literally don't need this at all what's the point. T…" (`ytc_Ugx8ptJ7X…`)
- "Take a good look at our future folks. Automatic driverless cars and items being …" (`ytc_UgzmmoaM3…`)
- "You cannot create general intelligence from machine learning, at least not the t…" (`ytc_UgzCFq55t…`)
Comment
One way to look at the comparison to a nuclear bomb is that only two have been used to target humans and continued research lead to nuclear power. Limiting use in weapons systems is important, IBM's suggestion for precision regulation rather than regulating AI technology itself is reasonable.
Source: youtube · Topic: AI Governance · Posted: 2023-07-14T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwgX28PpJ1fcr50mdN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzkiosqpm1NZfJ_7L54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugxae1kB1p4w0pDQESV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxL-gb7xmIYoqmjk5R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOjSi6J4BLXHTUqmx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzj8KLDVkPyGBHuVsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugzb9s25IAdudpN0zhp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzuMZtHKQ6RYcxLBI94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxVcqWNH7wsoPCbq3d4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwmguv5gu-PhkQvdmB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
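A response like the one above can be validated before the codes are stored. The sketch below is a minimal example, assuming the four dimension names shown in the coding result and treating the allowed value sets as inferred from the responses on this page (the full codebook may define more categories); the function name `parse_coding_response` is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw):
    """Parse one raw LLM response into {comment_id: codes},
    dropping records with missing fields or out-of-codebook values."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        codes = {dim: record.get(dim) for dim in SCHEMA}
        if cid and all(codes[d] in SCHEMA[d] for d in SCHEMA):
            coded[cid] = codes
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"approval"}]')
print(parse_coding_response(raw)["ytc_example"]["policy"])  # liability
```

Dropping malformed records rather than raising keeps one bad line in a batch response from discarding the other codes.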