Inside China’s Smart-Home Industry: How Product Teams Navigate Privacy, Compliance and Culture
11 Jan 2025
Hosted by Shijing He
Image generated by ChatGPT
The connected-home boom is reshaping daily life far beyond Silicon Valley. China alone counted more than 78 million smart homes in 2023 and is projected to add tens of millions more over the next few years. While researchers have examined what users and bystanders think about being recorded by speakers, locks and cameras, remarkably little is known about the people who design and build those devices. What do Chinese engineers, UX designers and compliance officers believe about privacy? How do sweeping data-protection laws, Confucian family norms and relentless market pressure collide inside a product roadmap?
To find out, my co-authors and I interviewed 27 members of Chinese smart-home product teams—from algorithm engineers at start-ups to privacy specialists in state-owned giants. The conversations reveal a nuanced picture: genuine ethical concern and deep respect for new privacy statutes, but also tight timelines, cultural expectations about family hierarchy, and a legal environment where national-security clauses often outrank individual rights.
Framing the Study
Three research questions guided our work:
How do Chinese smart-home product team members address privacy in their day-to-day tasks?
How do they perceive conflicts among the many stakeholders in a multi-user smart home—owners, guests, children, domestic workers, the state?
Which technical, social or legal strategies do they employ to balance convenience, safety and privacy?
Methods in Brief
We conducted semi-structured interviews in Mandarin, averaging 85 minutes, with professionals who had designed or shipped at least one smart-home device. Roles ranged from software engineers and UX designers to compliance lawyers and marketing leads. Participants came from companies of all sizes—small (100 employees or fewer), medium (101–500) and large multinationals or state-owned enterprises (more than 500)—and covered a wide spectrum of products: cameras, voice assistants, doorbells, thermostats, baby monitors, even cooking appliances.
All conversations were transcribed, translated and thematically coded by three bilingual researchers. We iterated on a shared codebook until inter-coder reliability (Cohen's κ) exceeded 0.88, ensuring consistent interpretation of the material.
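For readers less familiar with the statistic, here is a minimal sketch of how Cohen's κ is computed for two coders; the toy codes below are illustrative, not excerpts from our actual codebook.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labelling the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labelled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Toy example: two coders applying thematic codes to five excerpts.
a = ["compliance", "culture", "culture", "power", "compliance"]
b = ["compliance", "culture", "power", "power", "compliance"]
print(round(cohens_kappa(a, b), 2))  # 0.71 on this toy data
```

Values above roughly 0.8 are conventionally read as strong agreement, which is why we kept iterating until κ exceeded 0.88.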
What We Learned
1. Legal Compliance First—But Often at Launch, Not Design
Almost every interviewee described privacy review meetings with in-house lawyers as a non-negotiable milestone. Yet those meetings usually happened after core features were settled. Product managers spoke of “red-lining” illegal ideas at final review rather than embedding privacy principles from the outset. In other words, Chinese teams recognise the Personal Information Protection Law (PIPL) and related rules, but practical schedules keep privacy as an independent checklist, not a creative driver.
2. UX Designers Know Users—Less So the Law
UX and industrial designers voiced sincere concern for users’ feelings yet admitted limited training in data regulations. Many assumed legal colleagues would “take care of the paperwork,” reinforcing a compartmentalised workflow. One senior designer confessed she “wasn’t entirely sure which logs the camera back-end stored” because that belonged to cloud engineers and compliance staff.
3. Culture Shapes What Counts as “Normal”
Participants repeatedly noted that Chinese customers—and thus product roadmaps—are affected by three intertwined factors:
Collectivism and filial piety. Parents often expect to monitor children’s study posture with a camera-equipped lamp; adult children feel obliged to install sensors in elderly parents’ homes for safety.
Routine public surveillance. The omnipresence of CCTV and facial recognition in cities normalises cameras indoors; users seldom push back unless footage leaks.
Real-name SIM cards and super-apps. Linking devices to phone numbers is mandatory, lowering the perceived threshold for sharing personal identifiers.
Engineers internalise those norms. While Western developers debate the ethics of always-on cameras, Chinese teams may view them as expected functionality, albeit with stronger encryption or on-device processing to meet compliance.
4. Power Imbalances Complicate “User-Centred” Design
Teams recognised that owners control devices, guests do not, and that domestic workers or tenants rarely have bargaining power. Yet recruiting at-risk groups for field research proved hard: ethical approval, employer resistance and trust deficits shrank sample pools. As a result, several companies rely on proxy feedback—parents speak for children, managers speak for cleaners—leaving blind spots in feature design.
Developers also described “top-down” directives from platform giants. If a small vendor wants its kettle or lock inside a mega-platform’s ecosystem, it must meet strict audit checklists that sometimes exceed national law but prioritise brand integrity over tailored privacy controls.
5. Mitigation Strategies Span Code, Culture and Law
Teams deploy a mix of fixes, two of which are sketched in code after this list:
Data minimisation—sending only posture skeletons, not raw video, to the cloud.
Granular roles—a “super-admin” account grants limited guest passes or domestic-worker windows.
On-device encryption with proprietary decoders, so even internal staff need special keys.
Anonymisation by design—millimetre-wave radars detect presence without capturing images.
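First, the data-minimisation idea in miniature. This is a sketch under stated assumptions, not any vendor's implementation: extract_skeleton stands in for a hypothetical embedded pose model, and the point is simply that only joint coordinates are ever serialised for upload, never raw pixels.

```python
import json

def extract_skeleton(frame):
    """Placeholder for an on-device pose model (e.g. 17 COCO-style joints).
    We fake the inference step here; a real device would run an embedded
    network and discard the frame immediately afterwards."""
    height, width = len(frame), len(frame[0])
    # Fake output: every joint at the frame centre, confidence 1.0.
    return [{"x": width // 2, "y": height // 2, "conf": 1.0} for _ in range(17)]

def cloud_payload(frame, device_id="lamp-01"):
    """Build the only message that leaves the device: joints, not pixels."""
    skeleton = extract_skeleton(frame)
    del frame  # raw pixels are never serialised or uploaded
    return json.dumps({"device": device_id, "pose": skeleton})

fake_frame = [[0] * 640 for _ in range(480)]  # stand-in for a camera frame
print(len(cloud_payload(fake_frame)))  # a few hundred bytes, versus ~0.9 MB
                                       # for a raw 640x480 RGB frame
```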
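Second, the granular-roles pattern, again with illustrative names rather than any participant's real code: a super-admin owner issues narrow, expiring passes to guests and domestic workers.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class AccessPass:
    holder: str          # e.g. a guest's or domestic worker's account ID
    actions: frozenset   # which device actions the pass allows
    expires: datetime    # pass is invalid after this moment

class SmartLock:
    """A lock whose owner (super-admin) grants narrow, expiring passes."""

    def __init__(self, owner: str):
        self.owner = owner
        self._passes: list[AccessPass] = []

    def grant(self, holder: str, actions: set, hours: float) -> AccessPass:
        p = AccessPass(holder, frozenset(actions),
                       datetime.now() + timedelta(hours=hours))
        self._passes.append(p)
        return p

    def allowed(self, user: str, action: str) -> bool:
        if user == self.owner:  # super-admin can do anything
            return True
        now = datetime.now()
        return any(p.holder == user and action in p.actions and now < p.expires
                   for p in self._passes)

# A cleaner gets a four-hour "unlock" window but can never view camera logs.
lock = SmartLock(owner="alice")
lock.grant("cleaner", {"unlock"}, hours=4)
print(lock.allowed("cleaner", "unlock"))     # True (within the window)
print(lock.allowed("cleaner", "view_logs"))  # False (action never granted)
```

A real system would persist passes and authenticate users cryptographically; the point here is the shape of the policy: default-deny for everyone except the owner, with narrowly scoped, expiring grants.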
Beyond code, interviewees urged community education. Several proposed privacy literacy seminars run by neighbourhood committees (居民委员会), printable handbooks shipped with devices and curriculum modules in compulsory schooling. They argued that informed consumers would pressure lagging brands and help shift corporate priorities. At the legal level, participants asked for clearer guidance on ambiguous PIPL terms (e.g., “separate consent”) and a tiered risk framework so small firms could budget compliance without hiring full-time lawyers.
Implications
For Designers and Engineers
Move privacy left. Treat compliance not as a release hurdle but as a design prompt. Storyboard how a guest will feel when the ambient lamp glows red and build opt-outs early.
Surface unseen stakeholders. Use diaries or remote ethnography to hear from nannies, elderly tenants and gig cleaners who cannot attend formal focus groups.
For Policymakers
Synchronise law and tech. The rapid fusion of AI and edge computing outpaces annual guidelines. Consider shorter review cycles and regulatory sandbox pilots to align rules with hardware launch tempos.
Reward good actors. Offer tax breaks or fast-track certifications to companies that prove by-design privacy, nudging the market rather than solely punishing breaches.
For Researchers
Go beyond WEIRD (Western, Educated, Industrialised, Rich, Democratic) samples. The Chinese context shows how collectivist norms, state surveillance and platform ecosystems reshape privacy logics. Comparative studies across Asia, Africa and Latin America could reveal further diversity and suggest global design principles.
Privacy debates often pit Silicon Valley libertarianism against European human rights. China provides a third, more complex stage: rapid digitisation, ambitious data laws and deep-rooted cultural expectations of family oversight and state authority.
By listening to the product teams inside this ecosystem, we uncover pragmatic ethics: designers worry about shameful user experiences, engineers encrypt footage they themselves would hate to leak, and compliance officers juggle national-security requests with consumer trust. Yet resource gaps, power asymmetries and cultural scripts keep perfectly private smart homes out of reach—at least for now.