Ajit Dansingani is HP's senior vice president of strategy and business planning across the company's personal computing and gaming devices sectors. He has developed the tech hardware giant's on-device AI strategy, encompassing silicon strategy, AI experience offerings, and new product opportunities.
As AI's influence expands across the computing industry, tech executives are hard-pressed to find immediate use cases that deliver near-term ROI while also investing heavily in long-term AI applications that may revolutionize consumer technology. Ajit sat down with Sanjay Verma, AlixPartners' leader of digital and product innovation, to discuss how AI will reshape personal computing and how companies can apply the technology to future-proof their businesses.
What are you seeing in terms of AI revolutions? What do you think the impact will be on the tech industry in the next 3-5 years?
I see three big shifts. First, reasoning and tool-using models turn AI from a chatbot into a problem-solver—planning, calling APIs, and closing loops. Second, AI becomes personal and context-aware — it remembers your work, your preferences, and acts across apps. Third, cost curves bend: smaller, efficient models run locally; heavy lifting bursts to the cloud. The net impact is twofold: every software category is rewritten around automation, and most companies establish a dedicated AI platform team.
How do you see AI evolving the demands of the personal computing industry?
We're moving from "point-and-click" to "intent-and-outcome." I expect three practical changes:
- Assistants become a first-class part of the operating system. You trigger them by hotkey or wake word; they understand what's on screen, act across apps, and show results inline.
- Always-on context: the PC understands intent, tasks, files, and meetings, and proposes next actions.
- Performance/power tuned for AI bursts: snappy local inference without killing the battery.
Success means simpler interactions and faster outcomes.
In short: the personal computer finally becomes personal — a productivity companion that understands your goals, context, and preferences.
Are there new ways in which people will use and communicate with technology as AI becomes more ubiquitous?
I really believe the way we interact with technology will fundamentally change. Our AI interactions will become ambient and multimodal, meaning you can speak, type, point, or show, whichever is fastest in the moment. Context will flow with you: your work and preferences carry across devices. And systems will become more emotionally intelligent, picking up on tone, urgency, and focus, so they can adjust pace, verbosity, and notifications. The result: the interface feels invisible, intuitive, and supportive.
Is there a difference in terms of how agentic AI could have an impact on the personal computing industry?
Agentic PCs are inevitable: the user interface is shifting from clicking through steps to delegating outcomes. You ask for results ("prep the board pack," "book travel within budget") and the agent executes across apps and data, then reports back. The differentiator is trust. Guardrails define what the agent can and cannot touch, and provenance shows what it did and why. This will also require a permissions model that users inherently trust before allowing agents to operate across their files, communications, and accounts.
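To make the guardrails-and-provenance idea concrete, here is a minimal sketch of what an agent permission scope and audit trail could look like. The names (AgentScope, ProvenanceLog, authorize) and the policy fields are hypothetical illustrations for this article, not an existing HP or operating-system API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AgentScope:
    """What the agent may touch (hypothetical guardrail model)."""
    readable: frozenset[str]       # e.g. {"files:work", "calendar"}
    writable: frozenset[str]       # e.g. {"calendar"}
    spend_limit_usd: float = 0.0   # hard cap for purchases such as travel booking

@dataclass
class ProvenanceLog:
    """Append-only record of what the agent did and why."""
    entries: list[dict] = field(default_factory=list)

    def record(self, action: str, resource: str, reason: str) -> None:
        self.entries.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "resource": resource,
            "reason": reason,
        })

def authorize(scope: AgentScope, action: str, resource: str, cost_usd: float = 0.0) -> bool:
    """Allow the action only if it stays inside the user's declared guardrails."""
    if action == "read":
        allowed = resource in scope.readable
    elif action == "write":
        allowed = resource in scope.writable
    else:
        allowed = False
    return allowed and cost_usd <= scope.spend_limit_usd

# Example: a travel-booking agent limited to calendar writes and a $500 cap.
scope = AgentScope(readable=frozenset({"files:work", "calendar"}),
                   writable=frozenset({"calendar"}),
                   spend_limit_usd=500.0)
log = ProvenanceLog()
if authorize(scope, "write", "calendar", cost_usd=320.0):
    log.record("write", "calendar", "booked flight within budget per user request")
```

The design point is that the user declares the scope up front, and every agent action is both checked against it and logged, which is what makes delegation auditable and, over time, trusted.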
Does AI and its power and data needs alter the way the personal computing architecture will change? If so, what are some of the ways that may evolve?
Yes. With greater adoption of AI, we will see increasing demand for AI inferencing, which consumes a lot of power and compute cycles. A hybrid compute approach is likely to become the default: the CPU handles control flow, the xPU (GPU, NPU, TPU) handles inference, and low-power islands handle background tasks. This enables PCs to perform complex AI tasks without excessively draining the battery. Memory bandwidth and fast storage will matter more than raw clocks. You'll also see secure enclaves for private context and a hybrid orchestration/placement engine that decides what runs locally vs. in the cloud to balance latency, cost, and battery.
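As an illustration of that orchestration/placement idea, the sketch below routes an inference request either to the local NPU or to the cloud based on privacy, request size, latency budget, and battery level. The Task and place names and all thresholds are assumptions for the example, not a description of any shipping scheduler.

```python
from dataclasses import dataclass

@dataclass
class Task:
    tokens: int             # rough size of the request
    private: bool           # contains sensitive local context
    latency_budget_ms: int  # how quickly the user needs a response

def place(task: Task, battery_pct: float, npu_free: bool,
          local_ceiling_tokens: int = 4_000) -> str:
    """Decide where an inference request runs: 'local' or 'cloud' (illustrative heuristics)."""
    if task.private:
        return "local"                    # private context never leaves the device
    if task.tokens > local_ceiling_tokens:
        return "cloud"                    # heavy, long-context jobs burst to the cloud
    if battery_pct < 20 and task.latency_budget_ms > 1_000:
        return "cloud"                    # save battery when the user can wait
    return "local" if npu_free else "cloud"

# Example: a short, private summarization stays on-device.
print(place(Task(tokens=800, private=True, latency_budget_ms=300),
            battery_pct=55, npu_free=True))   # -> "local"
```

A real placement engine would also weigh per-token cloud cost and model quality, but the trade-off it balances is the same: latency, cost, and battery.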
What are some of the new use cases and new form factors in personal computing you think are evolving due to AI?
There are probably three broad categories of new use cases and form factors:
- Workflows: Today, humans bridge the work between isolated software islands, so the workflow depends on manual handoffs. A key use case is using agents to carry the workflow between steps and bringing the human into the loop where their involvement has the most impact.
- Wearables: Lightweight devices that provide intelligence on demand, such as live translation or letting a remote assistant "see what I see" for remote support and training.
- Ambient devices: A suite of home, desk, or room devices that interact with your agents to customize the experiences around you and initiate actions, such as starting a meeting, pulling the brief, or drafting a reply, so your workflow has a running start.
Does the rise of AI lead to a shift where the personal computing industry has a new set of opportunities to explore? If so, what do you believe those could be?
Absolutely—differentiation moves to experiences, not just specs. Think "PC + software assistants + services" bundles, privacy-grade on-device features, and vertical solutions (creator, student, developer SKUs) where hardware, models, and workflows are tuned end-to-end.
How are hardware manufacturers embedding AI? What are some of the key advantages they can leverage when competing with the large software players?
This is where customized hardware and software deliver a better experience together:
- Latency & battery: xPUs/FPGAs optimized for common AI tasks.
- Privacy: Sensitive inference stays local.
- Sensors & signal chain: Better microphones/cameras/image signal processors mean better AI.
- Deep OS integration: Hotkeys, wake word, context capture.
Bundle that with targeted model tuning and you deliver unique experiences that are only possible with a combination of silicon and sensors.
How is AI changing business models for the personal computing industry? What will be the role of AI-powered services and offerings going forward?
AI is enabling PC OEMs to explore adjacent opportunities. For example, HP's Workforce Experience (WXP) service is a value-added offering that lets IT manage and monitor a variety of end‑user devices, with AI capabilities—such as proactive remediations—that improve customer and end‑user experience. This leads to fewer tickets and higher device uptime.
Are there any new AI-driven offerings or service opportunities for personal computing businesses that we have not talked about?
I can think of a few:
- Personal data vaults: Private indexing of your life with trusted sharing.
- Enterprise knowledge workspace (private RAG): A secure, on-device/edge-first index over docs, email, chat, and wikis with policy-aware retrieval and citations (see the sketch after this list).
- Creator accelerators: Personalized, style-locked pipelines for video, design, and code, bringing GenAI capabilities that are dynamically tuned to your individual style, workflow, and preferences.
- Device lifecycle AI: Performance tuning and proactive failure management.
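For the private-RAG workspace mentioned above, here is a minimal sketch of policy-aware retrieval with citations. The Doc, retrieve, and answer_with_citations names, the group-based access control, and the keyword-overlap ranking are simplifying assumptions; a production system would use embeddings, a local model, and a real policy engine.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    doc_id: str
    text: str
    allowed_groups: frozenset[str]   # policy label: which groups may retrieve this

def retrieve(query: str, corpus: list[Doc], user_groups: set[str], k: int = 3) -> list[Doc]:
    """Policy-aware retrieval: filter by access control first, then rank by keyword overlap."""
    q_terms = set(query.lower().split())
    visible = [d for d in corpus if d.allowed_groups & user_groups]
    ranked = sorted(visible,
                    key=lambda d: len(q_terms & set(d.text.lower().split())),
                    reverse=True)
    return ranked[:k]

def answer_with_citations(query: str, corpus: list[Doc], user_groups: set[str]) -> str:
    """Build a grounded context block; each snippet carries its doc_id as a citation."""
    hits = retrieve(query, corpus, user_groups)
    # In a real workspace this context (plus the query) would be passed to a local model.
    return "\n".join(f"[{d.doc_id}] {d.text}" for d in hits)

corpus = [
    Doc("wiki-42", "Q3 launch plan and retail channel pricing", frozenset({"marketing"})),
    Doc("hr-7", "Compensation bands for 2025", frozenset({"hr"})),
]
print(answer_with_citations("launch pricing", corpus, user_groups={"marketing"}))
```

The point of keeping the index on-device or at the edge is exactly what the policy filter shows: retrieval respects who is asking before anything reaches a model, and the citations make the answer verifiable.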
What types of investments are required by personal computing (e.g., chips, memory, power efficiency, AI models, services, etc.) to go after these opportunities?
There are a broad range of areas that would require investment. On the system side we'll need silicon in the form of power-efficient inference accelerators and fast memory, sensory input (low-power wake words, mic arrays, cameras), and OS capabilities (context APIs, secure stores, permissions, universal AI runtimes). On the AI side we'll need small, high-quality local models; data pipelines for continuous fine-tuning; eval/safety tooling; and MLOps to ship models as reliably as system updates.
How is the industry managing the trade-off between on-device AI processing and cloud-based AI?
Like past computing shifts, AI will run in a hybrid model: some tasks on‑device, others in the cloud. The split varies by industry and depends on three things—privacy/compliance, performance (latency and quality), and economics (compute cost). In practice, run on-device when privacy or latency matters, and burst to the cloud for heavy, long‑context, or shared tasks.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.