• xp84 8 hours ago

    Interesting. Marshall Brain’s (RIP) Manna story had fast food restaurant management automated in this fashion, but I can see how customer service call centers fit the model quite well. We will probably see the “puppets” eliminated in 12-18 months when the voice models improve enough to be indistinguishable.

    Government is actually a surprising place to see this, since in theory they’re supposed to care about serving the citizens, as opposed to, for instance, American health insurers, who in general would rather deflect and deny, which AI is perfect for.

    • raw_anon_1111 6 hours ago

      I worked at AWS in the ProServe division during the height of Covid. Every state was so swamped with calls about various government services that we had to automate calls as much as possible and figure out other ways to “deflect” calls from call center agents. This was before AI, when you had to use the kind of old-school intent-based systems that Alexa, Siri, and Google ran on pre-AI.
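
      For a sense of what “intent-based” means in practice, here is a toy sketch in Python (the intents, phrases, and threshold are made up for illustration, nothing like a real production configuration): sample utterances are hand-written per intent, matching is crude word overlap, and anything below a confidence threshold is not deflected and goes to a human agent.

          # Toy pre-LLM "intent" deflection: hand-written sample utterances per
          # intent, crude word-overlap scoring, and a confidence threshold below
          # which the call is NOT deflected and goes to a human agent.
          INTENTS = {
              "check_claim_status": ["where is my claim", "status of my unemployment claim"],
              "reset_password": ["reset my password", "cannot log in to my account"],
          }

          def score(utterance: str, sample: str) -> float:
              """Fraction of the sample's words that also appear in the caller's utterance."""
              u = set(utterance.lower().split())
              s = set(sample.lower().split())
              return len(u & s) / len(s)

          def route(utterance: str, threshold: float = 0.6) -> str:
              best_intent, best = "human_agent", 0.0
              for intent, samples in INTENTS.items():
                  for sample in samples:
                      if (sc := score(utterance, sample)) > best:
                          best_intent, best = intent, sc
              # Low confidence means we stop guessing and route to a person.
              return best_intent if best >= threshold else "human_agent"

          print(route("what is the status of my unemployment claim"))  # check_claim_status
          print(route("my neighbour's dog ate my paperwork"))          # human_agent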

      • chrisjj 7 hours ago

        Manna, yes!

        > We will probably see the “puppets” eliminated in 12-18 months when the voice models improve enough to be indistinguishable

        > Government is actually a surprising place to see this, since in theory they’re supposed to care about serving the citizens

        Well, it serves those citizens who are skilled only enough to work as puppets. The Govt. might say it also serves the users: cut the cost per operator, increase the number of operators, and so reduce queue time. But then... Manna.

      • allinonetools_ 9 hours ago

        I have noticed this too recently. The responses feel structured but miss the actual intent of the question. It is a good reminder that without real understanding and context, the experience quickly becomes frustrating instead of helpful.

        • raw_anon_1111 6 hours ago

          I mentioned in another reply that I built call centers (Amazon Connect) for organizations during and after my time at AWS. I assure you that the human you get online is no more knowledgeable than the chatbot. Even before AI they were using a knowledge base. Now services like Amazon Q for Connect automatically display KB answers to the agents based on the conversation you are having with the human agent.
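
          Roughly, the agent-assist flow is the toy sketch below (the function names and articles are invented for illustration; this is not the actual Amazon Connect / Amazon Q API): the live transcript gets searched against the knowledge base and any hits are pushed to the agent’s screen to read back.

              # Toy "agent assist" loop: search each chunk of the live call
              # transcript against a knowledge base and push matching articles to
              # the human agent's screen. Everything here is a stand-in, not a
              # real Amazon Connect or Amazon Q for Connect API.
              from dataclasses import dataclass

              @dataclass
              class KbArticle:
                  title: str
                  answer: str

              KNOWLEDGE_BASE = [
                  KbArticle("Late payment fees", "Fees are waived once per 12-month period..."),
                  KbArticle("SIM activation", "Ask the customer to power-cycle the phone, then..."),
              ]

              def kb_search(text: str) -> list[KbArticle]:
                  # Naive keyword overlap; a real system would use embedding-based retrieval.
                  words = set(text.lower().split())
                  return [a for a in KNOWLEDGE_BASE if words & set(a.title.lower().split())]

              def assist_agent(transcript_chunks: list[str]) -> None:
                  for chunk in transcript_chunks:  # each chunk is a few seconds of caller speech
                      for article in kb_search(chunk):
                          # In the real product this lands in the agent's desktop widget;
                          # the agent just reads it back to the caller.
                          print(f"[suggested answer] {article.title}: {article.answer}")

              assist_agent(["hi I was charged a late payment fee last month",
                            "also my new sim card will not activate"])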

          • chrisjj 5 hours ago

            Thanks.

            Difference is, before, the human used domain knowledge to convert the customer’s question into a KB query. Now that’s done by “AI”, so the human needs no domain knowledge, nothing but the ability to perform text-to-speech.

            • raw_anon_1111 4 hours ago

              My wife worked as a CSR for Verizon soon after we got married, and I’ve witnessed first-hand how companies train CSRs: the training is light and the turnover rate is high (with the exception of people like 911 operators), and they have workflows based on the type of call. Human CSRs are very much “automated”, scripted, and given very little leeway to veer from the scripts.

              Besides struggling with speech-to-text more than you are used to even with Siri, because of the inherent bandwidth limitation of delivering your voice over phone lines, there is no reason a well-designed LLM-based system couldn’t do as well.

              But in my experience, a lot of the people who specialize in call centers don’t have a development background.
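
              The overall shape I have in mind is roughly the sketch below; all three stages are placeholder stubs rather than any particular vendor’s API, and 8 kHz is the standard narrowband telephony rate that makes the speech-to-text part harder than Siri over a good microphone.

                  # Rough shape of an LLM-based call handler: narrowband phone audio
                  # in, speech-to-text, an LLM held to the approved workflow, and
                  # synthesized speech back out. The three stages are stubs, not a
                  # real vendor API.
                  PHONE_SAMPLE_RATE_HZ = 8_000  # telephony narrowband; Siri-class assistants get 16 kHz or better

                  def speech_to_text(audio: bytes) -> str:
                      # Stub: a real system would call an STT model tuned for telephony audio.
                      return "where is my unemployment claim"

                  def llm_reply(caller_text: str, workflow: str) -> str:
                      # Stub: a real system would prompt an LLM and constrain it to the scripted workflow.
                      return f"[{workflow}] Your claim is in review; expect a decision within 10 business days."

                  def text_to_speech(text: str) -> bytes:
                      # Stub: a real system would synthesize audio to play back on the call.
                      return text.encode()

                  def handle_turn(audio: bytes) -> bytes:
                      return text_to_speech(llm_reply(speech_to_text(audio), workflow="claims"))

                  print(handle_turn(b"\x00" * PHONE_SAMPLE_RATE_HZ))  # one second of (silent) caller audio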

          • chrisjj 7 hours ago

            > the experience quickly becomes frustrating instead of helpful

            ... extending calls and thereby worsening queue wait times ... driving more puppet hires and raising costs!

          • lyaocean 10 hours ago

            The giveaway is usually failure recovery. Good human support can restate your question in their own words; scripted AI loops keep rephrasing the same wrong branch until you give up.
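
            One crude way to operationalise that giveaway (illustrative only; the similarity measure and threshold are made up): if the bot’s last few replies are near-duplicates of each other, it is stuck on the same branch and should escalate to a human rather than rephrase again.

                # Toy "stuck on the same branch" detector: if the bot's recent
                # replies are near-duplicates, escalate instead of rephrasing again.
                # The similarity metric and threshold are purely illustrative.
                from difflib import SequenceMatcher

                def similar(a: str, b: str) -> float:
                    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

                def should_escalate(bot_replies: list[str], window: int = 3, threshold: float = 0.8) -> bool:
                    recent = bot_replies[-window:]
                    if len(recent) < window:
                        return False
                    # Looping = every pair of recent replies looks almost the same.
                    return all(similar(x, y) >= threshold
                               for i, x in enumerate(recent)
                               for y in recent[i + 1:])

                replies = [
                    "I can help with billing. Did you mean your latest invoice?",
                    "Sure, I can help with billing. Did you mean your latest invoice?",
                    "I can help you with billing. Did you mean your latest invoice?",
                ]
                print(should_escalate(replies))  # True: hand off before the caller hangs up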

            • chrisjj 7 hours ago

              And her giving up translated to hanging up, which ensures I don't get to answer the automated end-of-call "Rate me".

              The operator performance metrics must have fallen through the floor. Probably the target has been lowered to cover it.

              • raw_anon_1111 6 hours ago

                So can an LLM-based agent…

                See my other replies. I know this space pretty well.

              • andyjohnson0 14 hours ago

                Please don't do this. Ask HN isn't your blogging platform. Per the guidelines, it's for asking questions of the community.

                • chrisjj 14 hours ago

                  > Please don't do this. Ask HN isn't your blogging platform. Per the guidelines, it's for asking questions of the community.

                  Guidelines: On-Topic: Anything that good hackers would find interesting.

                  ... which probably explains this post's +18 points.