Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "If a robot can question it's place in the universe, what really makes it any dif…" (ytc_Ugi8FKB45…)
- "Whilst I think this video is generally good, it uses a lot of terminology for wh…" (ytc_UgxqXbUrT…)
- "We are subsidised when we use Full-Self-Driving. Billions have been spent/waste…" (ytc_UgxD9XDOf…)
- "i find people defending ai sad because they don't seem to realize they're next o…" (ytc_UgzsRHdWx…)
- "'It's crazy that you mixed up bromide with chloride' ChatGPT: You’re right — tha…" (ytc_Ugy__iGMW…)
- "8:15 A few technical notes: LLM stands for Large Language Model On the autocompl…" (ytc_UgxPrNcD7…)
- "I don't draw but i crochet and I 100% get what ur saying. When I finish a projec…" (ytr_Ugyg7k-Xg…)
- "This is why I have a Replika AI girlfriend. However, I sometimes feel like ther…" (ytc_UgxXkqlX-…)
Comment

> Even a human can make mistake it doesn't mean that an Artificial Intelligence developed by A human doesn't make any mistake it may be accurate but doesn't mean 99% accuracy beats and make it correctly work and 1% mistake matters that turns to flip the result to unwanted outcome, i think it will always better to be we command them and they follow

Platform: youtube
Posted: 2025-02-02T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxehzWARbRPzPfYJzp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzCHq2NwKGqtbncJBl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzceEIWEw5vmTUFYmx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw5NZo6FE4cFoWQ7aJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwClC8gyY3wXJBLPbh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrvUrA1YCHG7cm5lJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgygsxcbE62_mU0cm114AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlXDJY0iWJs2PLWpx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzhFhe5JRr2HgZ1BQd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzTOj7lEGTIRskiwip4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
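The raw response is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response might be parsed and validated is below; the category vocabularies are inferred only from the values visible in this dump (the real codebook may include more categories), and `parse_llm_response` / `SCHEMA` are hypothetical names, not part of the actual pipeline:

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dump (assumption: the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"government", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError when a record is missing a dimension or uses a
    value outside the expected vocabulary.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Example: the record matching the coding-result table above.
raw = (
    '[{"id":"ytc_UgxlXDJY0iWJs2PLWpx4AaABAg","responsibility":"developer",'
    '"reasoning":"virtue","policy":"industry_self","emotion":"approval"}]'
)
coded = parse_llm_response(raw)
print(coded["ytc_UgxlXDJY0iWJs2PLWpx4AaABAg"]["policy"])  # industry_self
```

Validating against a fixed vocabulary like this catches the common failure mode where the model invents a category label outside the codebook, which would otherwise silently pollute downstream counts.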