Details
-
About: Backend developer / system admin
-
Skills: C++, C#, Java, PHP, Kotlin, Python, HTML, CSS, JS, Unix, Apache, some Docker and LXC
-
Website
-
Github
Joined devRant on 3/18/2019
-
You called him the R- word?!!! That's just inappropriate! >:c
-
It's ok to use Windows, or Mac, or whatever you need. Just don't blame it on linux that you have to.
I mean damn, use whatever you want, but why do you have to shit on a good thing for no real reason? The solution to your problems is to make linux more popular so that companies give it proper support. By shitting on it for something that's not even its fault, you're making it worse. You're acting as if it's our fault that linux can't support locked proprietary formats and that companies don't compile their closed-source software for linux despite doing it for macOS, which is almost the same thing under the hood. -
@jestdotty hmm, maybe I shouldn't have used the word "manipulate". The point is just to ask for the information in pieces rather than all at once. There's no active manipulation taking place; it's the same form, properly built into the actual story and gameplay, which is good design
-
@jestdotty You still don't get it. There is 0% gaslighting intention because there is no intention. It's not a percentage game. You're assigning intent to an algorithm and that's already a mistake
-
So you hate linux because no one else supports linux? Genius. With this kind of thinking, we shouldn't even have free options at all!
Fucking stupid piece of shit graphite pencil, it just tells me to fuck off when I want to write on plastic bags. Markers don't do that! I hope pencil fanboys won't reeee at me :) -
@lungdart Thanks for the free coke!
-
cont'd
LLMs are not even capable of randomness on their own. Current LLM models achieve "creativity" only by random sampling *after* the neural network processing. So any sort of "novel" or "creative" output is not even an inherent property of neural networks; it's a mathematical trick that is almost as old as math itself: a random choice based on probabilities (or compound probabilities in the case of beam search).
It's a topic of debate whether humans have inherent random properties thanks to quantum-level effects or not. I won't claim to know if we do. I personally think we don't, but even if we do, it's a property of the entire thought process and not just "word prediction" -
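The sampling trick described above can be sketched in a few lines. This is a toy illustration (the function name and logits are my own, not from any real model): the "network output" is a fixed list of logits, completely deterministic, and all randomness comes from the sampling step bolted on afterwards.

```python
import math
import random

def sample_token(logits, temperature=1.0, seed=None):
    """Pick the next token index by sampling from softmax(logits / T).

    The network's forward pass produced `logits` deterministically;
    the only source of 'creativity' is this post-processing step.
    """
    rng = random.Random(seed)
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # inverse-CDF sampling: walk the cumulative distribution
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# Hypothetical logits for a 4-token vocabulary.
logits = [2.0, 1.0, 0.5, 0.1]
```

Lowering the temperature squashes the distribution toward the highest logit (near-greedy decoding); raising it flattens the distribution and makes the output look more "creative", without the network itself doing anything different.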
@lungdart It is different because:
1. We're not "random neuron connections". The brain has multiple specialized areas that are common to every human: a center that does predominantly speech, a part that does life support, the outer layer that encodes memories, and so on and so forth. There's nothing random about brains. The brain has evolved a specific structure over millions of years, where language processing is only one part of that.
2. Human language synthesis is not per-token generation. We don't put words in front of words based on previous words; we have a complex abstract conceptual thought process. (As very simple examples: how many times do you go back to rethink a sentence, or know whole paragraphs ahead what you actually want to say?) If you think we do less than that, you sorely underestimate human brains.
3. It's not creating "novel d&d campaigns"; it's a conglomeration of things it has already seen. LLMs are incapable of novel ideas.
cont'd -
Let's get rid of CEOs and Indian scammers
-
@jestdotty You just don't like that I used the word "manipulate", but it's not meant to be nefarious. If it's used to enhance gameplay, it's a good trick to blow people's minds. It's not worse than what phone games and social media already do, except they actually sell your info to the highest bidder and you have nothing to show for it. In this case at least it's to improve the fun. Besides, using an indirect approach like this you're not going to get important personal information, only what people are comfortable sharing in chunks, which is more than they are willing to share all at once. It's not like someone will suddenly dump their social security number there. You gotta chill a little bit, this is a game design discussion, not Indian scam instructions
-
@jestdotty That's the thing though. It's not programmed for someone's intentions. They would have to train the AI purely on gaslighting conversations to achieve this; it can't just be 5MB in a 500GB dataset, it would get lost in the noise. Simply speaking, the LLM has no idea what it's doing or saying, it has no consciousness and it cannot plan for anything; it's a direct A -> B mapping of input to output. Any seeming gaslighting would be just pareidolia on the human's part.
Not to mention, I've never had this experience with any of the models. I perceive them simply as dumb, not deceiving, because that's what they are -
Well, maybe this helps you in future designs. But for me this is a privacy thing. I don't put my full name even into offline games. I don't trust any data that leaves my keyboard, and I think a lot of gamers think in these terms too.
Besides, the name isn't really important. Whatever the player types in, he's going to identify with that name. By forcing the player to enter both a full name and a nickname, you only get the nickname plus random garbage for the full name to obfuscate their identity. I assume you wanted to break the 4th wall by using their name or something, but many gamers are too well versed in internet privacy to actually put their full name anywhere. A first name maybe, but a last name you're not getting
I suggest you design your game in such a way that you can get this information from the player in a more manipulative way though... Instead of forms, try using character dialogues or indirect quizzes to ease the player into giving the info without even noticing. Make it gameplay -
@jestdotty I only disagree on a technicality. The term "gaslighting" is defined as:
"Gaslighting is an insidious form of manipulation and psychological control. Victims of gaslighting are deliberately and systematically fed false information that leads them to question what they know to be true, often about themselves."
It may be unconscious, but it has to be deliberate and systematic, which LLMs are not capable of. There is no long-term planning done by LLMs; they only generate token by token, looking backward at past context.
I mean, I know what y'all mean when you say that the LLM is gaslighting, but it's terminologically wrong, and for whatever reason those things always rub me the wrong way. Probably my autism speaking, but I think language is important.
But whatever, I already know we don't agree on LLMs xD -
@lungdart Well, if you ask anyone that knows me, they'd tell you I definitely don't put humanity on any sort of elevated place. As far as I'm concerned evolution is a random walk, we're just *good enough*, free will doesn't exist and humans are only animals like all the others.
So that definitely isn't my problem here. In fact, just using Occam's razor you'd end up agreeing with me, because assuming that LLMs can lie and that we have achieved AGI is a pretty big freakin' assumption. The space you are in sounds like a delusional echo chamber, and it sounds like you've never written or trained any kind of neural network or other ML algos.
But maybe it's you who thinks that you yourself are special, that you somehow gleaned the ultimate truth and know for certain that LLMs are now conscious and can "lie". You of course don't need any further investigation or evidence, you just kinda "know", I guess?
But hey, sufficiently advanced technology that's not understood can look like magic I guess. -
@jassole But that's exactly what I'm saying. There's a place and time for GC languages, same as there's a place and time for low-level memory management languages.
It's short-sighted to dump on GC languages with the argument "programmers dum now, we must memory". They don't threaten non-GC languages, and even if we agree that GC makes it easier even for bad coders to do decent work, that's still good for the rest of us, cause we get to do more interesting work. -
@lungdart "being wrong" is not lying.
Lying implies intent. LLMs don't have intent, and they also can't say "I don't know". They will finish the discussion no matter what -
@TeachMeCode The problem is we keep assigning human properties to things :) It doesn't gaslight you, it's just multiplying numbers until it gives you a string of words that kinda fit together within the context that fits into its input vector.
No matter how big we make the context, it will inevitably hallucinate things or fail to give attention to important details from the past. We just have to use it as such and not expect "intelligence" from it -
Honestly, Web3 is a smoke-and-mirrors joke. It doesn't actually exist, and it has already missed its time to exist.
If someone actually makes some breakthrough in the way the internet is used and served, I strongly suggest they call it Web3 instead of Web4, to override this old definition, because cryptos changed literally nothing in the end. We just have one more stock market in the form of ICOs and one more currency (well, 4000 more) that are volatile as fuck and more akin to gambling than investment. NFTs are an outright scam that even Ubisoft regrets entering, and now it's done only by financial chads that think they have a good idea, or by Indian scammers. The metaverse was dead on arrival and the only thing left is Meta itself. Web3 is something that marketing teams wanted to happen, but it didn't and now it won't.
The only thing still kinda kicking around is decentralized services in general, but that's barely anything new and at most deserves the tag Web2.1 -
Yeah, I disagree as well. GC has its place. It's not a one-for-one replacement; it hurts performance and produces larger binaries, but it also reduces boilerplate memory code in many places and lets you work on business logic rather than memory management, custom allocators and fixing the bugs that will inevitably arise no matter how godlike you are as a programmer.
Yeah, it makes our work more democratized, and more people that have little idea what they are doing can code, but that's not necessarily a bad thing, because they will learn, and if they won't, then they will take the low-code roles that none of us really want to do anyway.
And on the other hand, where peak performance is required you will still find open positions for good ol' C and C++, because paradoxically hard-to-read, black-magic pointer arithmetic code is the fastest code, and that's a skill that's sometimes still necessary (just look at the current LLM boom) -
LLMs as AI are a joke... but you just can't explain it to people, they still lap it up.
-
For me, the selling points that remain are that they are an extremely small form factor and they are ARM-based, which also draws less power. So technically servers running on Pis or similar SBCs are still more power efficient... I use my Pis to host a NAS, Gitea and Syncthing. But I also have a small x86 HP PC that runs some of my heavier services now, like Jellyfin and Pydio
-
@kiki I don't need to argue either, I can leave the arguing to researchers.
Check out the paper "Are We Monogamous? A Review of the Evolution of Pair-Bonding in Humans and Its Contemporary Variation Cross-Culturally"
The conclusion is that humans are historically mostly monogamous, and dozens of evolutionary traits we possess or lost suggest that while our ape ancestors were not exclusively monogamous, we evolved to be, which gave us an edge (I mean, look at how many of us there are :), clearly it's working)... While there are sometimes cultures that are polyandrous or polygynous, the predominant strategy is monogamy.
It may work for you, but let's be honest here, you're the odd one out, and for many others it might only lead to pain or at best time lost "playing around" during their peak age.
So do what you will with that information. If you're lucky you won't coerce many people into relationships that will cause suffering to them or others, but that's on you if it does :) -
@kiki I don't tell you how to live your life, so don't tell me how to live mine. You're advocating for degeneracy, and hey, you can do that, free speech and all, but I disagree with your rhetoric. If birth control wasn't a thing, polygamy would be a huge problem for our society. It's like you're a vegan taking supplements, yet still advocating how super healthy being a vegan is. We're not a tribe in a forest anymore
-
@kiki
> Sooner or later, everyone wants to fuck around
That's just not true; different strokes for different folks. Let's not pretend we're all addicted to sex and need to fuck people left and right for new highs. -
I have a love/hate relationship with dart.
I like some of the language concepts and I don't mind working with it, but I hate all the nesting and reactive stuff when working with flutter -
Those are not even equivalent, but I guess close enough to make a point.
C++ and std::vector would make it pretty much identical, but pure 98 C... yeah. You're gonna have to get low level, but at the same time with that C code you know 100% what it's doing, what the size of the memory is, how fast it will be accessed, where the data lives... everything... complete control. Meanwhile that Rust code probably uses dynamically resizable arrays, which already adds some mystery regarding the construction, destruction and allocation logic. Not much, but it's there, and it abstracts all of the low-level details. Which is good 99% of the time, but y'know -
Which one? North Korea?
-
I know at least one that still does it. And I'm thinking about trying it out since I WFH all the time anyway
-
Potentially it would be very easy to do with languages that support aspect-oriented programming, like Java with AspectJ.
You could globally wrap every method call and increase the logger indent on entry and decrease it on exit.
I think even in Python you could hack it together, but I'm not sure (possibly decorators would work well for this too) -
Find a Loving partner