When Adolfo Rodriguez, Guitar Center’s CTO and CIO, was learning to play the six-string some two decades ago, he wanted his instrument to imitate the crunchy power chords of AC/DC’s lead guitarist, Angus Young. He’d turn the knobs on his Gibson SG, yank the gain way up on his amp, and try pedals, only to learn that Young rarely used pedals.

Now he wants to help rockstar hopefuls find their sound quickly. In the coming months, Rodriguez said, an in-store large language model, trained on musical data and developed in partnership with an AI-service provider, will assist customers. He tested the model recently. “I told it, ‘Hey, I play this ’83 Les Paul Studio, and I have a Boss ME-70 pedal. How do I sound like Angus Young?’ And it spits out to me all of the settings that I need to do to approximate Angus Young’s sound, as close as possible with that equipment…That’s amazing,” he said. Rodriguez imagines patrons using the capability in-store, with the help of a QR code.

Guitar Center, a place where you can grab any guitar off the wall and play it, also wants to bring digital tech like GenAI into its stores to help employees and customers. As CTO, Rodriguez must strike the right balance in adding a “generative” component that can, at times, feel at odds with the creative process by generating ideas for the creator. Keep reading here.—BH