While the generative artificial intelligence boom is well into its third year at enterprises like personal-computer manufacturer Lenovo, Global Chief Information Officer Arthur Hu says patience is still a virtue.
“There’s a bit longer grace period, because people recognize that we haven’t quite figured it out yet,” says Hu. “There’s more willingness to invest in Gen AI, even if the result will be rapid learning.”
Many studies align with Hu’s thinking. Enterprises are spending more on generative AI in 2026 than in the prior year, and CEOs are far more optimistic about producing a return on investment, even as C-suite leaders acknowledge that the ROI time horizon for generative AI is longer than for other technology investments.
Lenovo and other PC makers have seen mixed benefits from rising AI demand. AI PCs accounted for 31% of the global PC market by the end of 2025, according to research firm Gartner. This new technology upgrade cycle gives Lenovo and other manufacturers a fresh opportunity to pitch new hardware. In Lenovo’s fiscal third-quarter earnings, reported last month, total revenue grew 18%, driven by strong demand for AI-related products. AI revenue increased by 72% from the prior year and now accounts for a third of Lenovo’s total business.
But as large companies like OpenAI, Amazon, and Alphabet build data centers, demand for memory chip production has surged, putting supply pressure on consumer electronics. Gartner has warned that rising component costs will dampen sales for PCs and smartphones in 2026.
Hu’s relationship with Lenovo began ahead of the 2008 financial crisis, when he worked on projects for the company in China as an associate principal at consulting giant McKinsey. He came aboard as the China-based PC maker was aiming to grow much larger and become a more international company.
More than two decades ago, Lenovo acquired IBM’s PC division to get a firmer grip on the global market. It later diversified into smartphones and infrastructure by acquiring Motorola Mobility from Google and IBM’s x86 server business. Hu joined Lenovo in 2009 as a corporate transformation leader, playing a key role in some of the integration after that dealmaking. He became CIO in 2016.
“Each of these cases represented significant steps in the company operating model, and therefore the types of systems and infrastructure support we needed to run the company,” Hu says of his evolving role, which coincided with advances in AI and machine learning.
Hu says that by the mid-2010s, Lenovo was following an AI playbook that could be replicated during the current generative AI boom. The company committed to using AI across the entire enterprise, starting with smaller use cases, and then expanding across product development, manufacturing, and other functions.
Even then, Hu says he had to address worker fears about job replacement. He tends to frame Lenovo’s AI push around three key themes. The first is that because Lenovo’s business is growing at double-digit rates, it can launch and leverage internal AI productivity tools that may slow down hiring, but aren’t intended to reduce total headcount. Second, Hu stresses that employees aren’t expected to deliver dramatic productivity gains within mere weeks or months. He understands change takes time.
And finally, Hu aims to lessen binary thinking that all work is done either by a human or by a machine. Reshuffling tasks, Hu says, isn’t so simple. When working with software engineers, Hu encourages them not to fixate on how much new code is being written by AI tools, but instead to look at the fuller scope of their workloads.
“The rest of the time is understanding the legacy code base, talking to stakeholders, you have to understand the API, what does the business analyst say?” says Hu. “All this other stuff makes up the other 85% of the job.”
Across Lenovo today, internal AI tools that have been deployed include Microsoft Copilot, AI coding assistants for software engineers, and AI-assisted conversational support for customer service representatives. Broader tools like Copilot typically require little training, but Hu says more upskilling is needed when rolling out more specialized, function-specific tools.
And Hu concedes that even with the longer lead time for generative AI, boards and executive committees have been putting more pressure on technologists to accelerate deployment and extract ROI. There is only so much patience, after all.
“That’s good and bad; it certainly pushes the pace,” says Hu. “People want to invest, but ultimately the economics have to work. Money isn’t free, as our CFO and finance community likes to remind us.”
