7
nitwhiz
1y

So how does every company have AI assistants overnight?

Are they all just using the OpenAI API?
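
Like, is it really just a thin wrapper around the Chat Completions endpoint? Something like this (rough sketch - the model name, the system prompt, and "AcmeCorp" are placeholders I made up, and it assumes OPENAI_API_KEY is set in the environment):

# Rough sketch of a "company AI assistant": a thin wrapper around the
# OpenAI Chat Completions API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def support_assistant(user_message: str) -> str:
    """Answer a customer question with a canned 'brand voice' system prompt."""
    response = client.chat.completions.create(
        model="gpt-4",  # placeholder; any chat-capable model works
        messages=[
            {"role": "system", "content": "You are AcmeCorp's helpful support assistant."},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(support_assistant("How do I reset my password?"))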

Comments
  • 7
    Believe it or not - there were non-LLM-based chatbots before ChatGPT.

    IBM had a project running as far back as 2015; they still use it for healthcare support.
    Cortana. "Ok Google". Fuckin' Siri.
  • 3
    Hype-train. That's all it is.

    It's just a way for companies to stay relevant, being like "pUrChAsE oUr pRoDuCt, wE hAvE AI"
  • 4
    And yet, it isn't going as fast as I hoped.

    Only one year between GPT-2 and GPT-3.

    There is a gap of 3 years between GPT-3 and GPT-4 - plenty of time for any big tech company to make its own version.

    Believe it or not, even Skyrim, released in 2011, used some AI stuff for NPCs and all. (Lots of other games too.)
  • 1
    @Grumm
    That's not AI. Not even close.
    If you define AI as a "random rule-based behavior generator", then that program that won some chess games back in 1985 is an AI too.
  • 0
    @Grumm one of the big issues that held back GPT-4 was available hardware. The same issue is being faced today with models that have even more parameters.

    But it's being built, it's only a matter of time.
  • 1
    @Sid2006 LLMs are not a hype train. No more of a hype train than having subject-matter experts (SMEs) ever was, except LLMs can do the work of millions of SMEs under minimal supervision.

    White-collar workers should be planning a career exit in the next 10 years.
  • 1
    @magicMirror Well again, the term 'AI' is pretty broad, I guess. What do you consider an AI?

    Are LLMs AI? They just follow rules programmed by devs. Are chess engines, or even the one that managed to beat a Go champion, also AI? They just follow rules created by humans to do a specific task faster. If I remember correctly, this is a huge problem at the moment: do we want an AI that can implement its own functions and write its own programs (like we do with our thoughts), or do we want it to be purely functional, a tool that makes jobs easier and faster?
  • 0
    @Grumm
    AI means artificial intelligence. The first part is a bit vague, but the second part is the actual problem. Intelligence is the ability to reflect, integrate new information, respond to new input, and - most importantly - be self-aware.
    ML is not AI. Why? Most ML approaches can't handle drift. Also - not aware. Can't reflect. Also - ML can "look" at a pic and tell you if there is a banana in it. Or no banana. "Tomato? wtf is a tomato?"
    LLMs are just the next logical step in the procedural generation approach. Remember old RPG games, where the game generated a new dungeon for every playthrough? Is that AI?
    ChatGPT is no different than that. Same for Stable Diffusion generators. Impressive, sure - but ultimately, a PRNG that feeds into an algorithm, which uses a prompt and a rule set to generate an output (toy sketch below).

    And now, the Turing test. How many people can actually pass it? How many can actually administer it correctly? Definitely not the marketing hype people...
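
    Anyway, here's a toy version of that dungeon-generator analogy from above (purely illustrative - a seed plus a fixed rule set, not a claim about how GPT works internally):

    # Toy "PRNG + rule set + prompt" generator: the prompt seeds the PRNG,
    # and a fixed rule set turns random draws into an output.
    import random

    ROOM_TYPES = ["treasure room", "trap corridor", "monster den", "empty hall"]

    def generate_dungeon(prompt: str, rooms: int = 5) -> list[str]:
        rng = random.Random(prompt)  # the "prompt" just seeds the PRNG
        return [rng.choice(ROOM_TYPES) for _ in range(rooms)]

    print(generate_dungeon("level 1"))  # same prompt -> same dungeon every time
    print(generate_dungeon("level 2"))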