40 Common Thesis Defense Questions and How to Answer Them


August 3, 2025

Welcome to your thesis defense guide. This post distills a sample quantitative study (digital literacy and stress among Filipino Gen Z university students) into a chapter‑by‑chapter defense guide, offering model Q&A responses rooted in relevant theory and local context to help you confidently convey the real-world relevance and scholarly rigor of your own study.

CHAPTER 1 THESIS DEFENSE QUESTIONS AND ANSWERS

1. Why did you choose this topic?

🧠 Committee focus:
Explain your personal, academic, or social interest in Filipino Gen Z, digital literacy, and mental health; highlight graduate-level importance; cite gaps in existing national data.

🗣️ Sample Answer:
Growing up in Metro Manila amid remote schooling during the COVID‑19 era, I witnessed my peers—mainly Gen Z students—showing strong digital fluency but also clear signs of stress tied to device overuse. A 2024 survey of 384 Filipino college students found high digital engagement, with a significant portion reporting “digital stress” and maladaptive coping patterns such as sleep disruption and multitasking (Giray, 2024). Meanwhile, technostress—the anxiety and overload caused by constant online connectivity—has been shown to reduce online learning quality across global studies (Saleem et al., 2024). Despite extensive mental health data in the Philippines, few studies examine how digital‐skill levels (beyond mere access) correlate with stress‑related symptoms in young adults. I chose this topic to investigate that specific intersection—digital literacy as a potential buffer against stress among Filipino Gen Z—since this focus could yield actionable insights for educators, counselors, and policy makers.


2. What is the main problem your thesis addresses?

🧠 Committee focus:
Define the specific research gap: digital literacy proficiency, not just access, and its influence on stress; emphasize lack of local research on this link.

🗣️ Sample Answer:
The central problem this study addresses is twofold. First, while “digital access” in the Philippines has dramatically improved post‑2020, large gaps remain in digital literacy—especially critical evaluation and safe content creation—among Gen Z students (Barrot et al., 2021). Second, although technostress is well documented as a global concern for students (Saleem et al., 2024), there is little empirical evidence in the Philippine context on whether digital literacy proficiency mitigates perceived stress. Most existing local studies treat digital skills and mental health as separate domains. As a result, educators and policy makers lack evidence on whether improving students’ critical digital competencies can reduce their stress levels. This study aims to fill that gap by linking digital literacy scores with self‑reported stress and technostress dimensions among Filipino Gen Z university students.


3. What is the significance of your study?

🧠 Committee focus:
Show practical and academic contributions: support curriculum design, digital wellness policy, counseling practices, and Filipino Gen Z mental health evidence.

🗣️ Sample Answer:
This study makes a novel contribution to both scholarship and practice by exploring the intersection between two vital yet siloed domains in Philippine education: digital literacy proficiency and student mental health. Internationally, technostress has been shown to degrade online learning and amplify anxiety (Saleem et al., 2024). Locally, however, research indicates widespread challenges in technological readiness and psychological well‑being among Filipino college students (Barrot et al., 2021). By empirically testing whether digital literacy subdimensions—such as information evaluation and tool fluency—predict lower stress, the study yields evidence that can inform public education policy, university digital wellness initiatives, and counseling interventions. Further, it contributes to the academic literature as one of the first Filipino studies to link proficiency in digital competencies with stress mitigation rather than just digital access or usage rates.


4. What are your research questions or hypotheses?

🧠 Committee focus:
State clear RQs and testable hypotheses: relationship between literacy and stress, plus influence of device‑use routines.

🗣️ Sample Answer:
This quantitative study is structured around three research questions (RQs) and two testable hypotheses:

RQ1. What levels of digital literacy—across subscales for device fluency, critical information evaluation, and content creation—are exhibited by Filipino Gen Z university students?
RQ2. What are their perceived stress and technostress symptom levels?
RQ3. Is there a statistically significant correlation between digital literacy (overall and subscales) and perceived stress levels?

H₁. Higher digital literacy, particularly in critical evaluation and safe use, will be associated with lower overall perceived stress.
H₂. Specific digital routines (such as nighttime device use, multitasking, or frequent social media posting) will be positively correlated with technostress symptoms.

These RQs and hypotheses operationalize the study’s aim to map literacy and stress levels and examine their interplay in the Philippine Gen Z context.


5. How did you come up with your research objectives?

🧠 Committee focus:
Demonstrate logical flow from gap to objectives; ensure each objective is measurable and aligns with RQs.

🗣️ Sample Answer:
I derived three precise objectives directly from the conceptual gaps identified:

  1. To quantify digital literacy levels—specifically device fluency, critical evaluation, and content creation—using a validated Philippine‑normed instrument.
  2. To assess perceived stress and technostress symptoms using established psychometric scales adapted for Filipino student samples.
  3. To analyze statistical relationships between digital literacy and stress outcomes, including moderating roles of device‑use routines and socio‑demographic factors such as household income, internet access quality, and academic year.

These objectives follow from the research questions and provide a concrete roadmap: first measure literacy, then measure stress, then test correlations controlling for relevant covariates. Achieving them enables evidence‑based recommendations for educational interventions and mental health support tailored to Gen Z youth in the Philippines.


6. Who will benefit from your study/thesis, and how?

🧠 Committee focus:
Identify key stakeholders: Gen Z students, educators, counselors, university administrators, mental health policy makers; describe specific uses.

🗣️ Sample Answer:
This research informs multiple stakeholders:

  • Gen Z university students will gain self‑insight on how digital habits and competence relate to stress—helping them adopt healthier digital routines.
  • Educators and curriculum designers can use the results to reshape digital literacy instruction, emphasizing not only tool use but also critical online resilience and self‑regulation.
  • University counseling centers will receive data on which problematic digital behaviors (like screen overuse before bed or multitasking) correlate with higher stress, supporting targeted psychoeducational workshops.
  • Higher education policymakers may leverage findings to justify inclusion of digital wellness modules in DepEd and CHED frameworks.
  • Parents and student support services benefit from culturally relevant insights that link digital skill deficits to mental health risks in young Filipinos.

By bridging digital proficiency with emotional well‑being, the study provides actionable, culturally grounded guidance across education and mental health sectors.


7. What are the limitations of your study?

🧠 Committee focus:
Be candid about design (cross‑sectional), self‑report bias, sampling scope (Metro Manila), rapidly changing digital environment, unmeasured confounds.

🗣️ Sample Answer:
The study has important limitations that warrant caution and suggest future direction:

  • Cross‑sectional design: Offers a snapshot of correlation, not causation; longitudinal tracking would be needed to examine directional effects.
  • Self‑report measures: Both literacy and stress scales are subject to recall bias or social desirability, despite including validity checks.
  • Sampling scope: The study focuses on Gen Z students from three Metro Manila universities, limiting generalizability to rural or Visayas/Mindanao populations.
  • Rapid digital evolution: By 2027 or 2028, platforms and usage norms may have shifted significantly—affecting replicability.
  • Unmeasured variables: Factors like pre‑existing mental health conditions, family support, or campus counseling access were not captured but may influence stress levels.

Despite these constraints, the study contributes foundational data and suggests avenues like longitudinal, mixed‑methods, and broader regional studies for future researchers.


8. How does your title reflect the content of your thesis?

🧠 Committee focus:
Highlight how each element of the title aligns with study elements: digital literacy, Gen Z, Philippines, stress, technostress, and quantitative design.

🗣️ Sample Answer:
The title—“Digital Literacy and Stress Levels of Gen Z in the Philippines: A Quantitative Analysis of Technostress and Device-Use Patterns”—was deliberately crafted to reflect exactly the what, who, where, and how:

  • “Digital Literacy” specifies evaluation of actual competency—not merely access or familiarity.
  • “Stress Levels” highlights both general perceived stress and specific technostress symptom domains.
  • “Gen Z in the Philippines” clearly defines the demographic cohort (approximately ages 18‑24) within a unique national and cultural context.
  • “Quantitative Analysis” signals that the study employs validated psychometric instruments and statistical hypothesis testing.
  • “Technostress and Device‑Use Patterns” succinctly conveys the independent variables under investigation—tying platform routines to mental-health outcomes.

Altogether, the title succinctly communicates the scope, population, and methodological rigor of the research.

CHAPTER 2 SAMPLE THESIS DEFENSE QUESTIONS AND ANSWERS

1. What is the theoretical or conceptual framework of your thesis?

🧠 Guide:
Explain the hybrid model grounding your study—specify who developed each theory, how they interlock, and why it fits your population/context.

🗣️ Sample Answer
My conceptual framework combines Ng’s (2012) tripartite model of digital literacy—technical operational fluency, cognitive evaluation/creation, and social/ethical communication—with the technostress theory as described by Kumar (2024), emphasizing emotional and physiological strain stemming from digital demands. In Ng’s framework, digital literacy is not just about tool use but includes the ability to critically evaluate online information and communicate responsibly in digital spaces—skills which map directly to students’ everyday behaviors (Ng, 2012). Kumar defines technostress as negative psychological strain resulting from pressures such as information overload, insecurity, and rapid tech changes (Kumar, 2024). This hybrid model situates digital literacy sub‑skills as independent variables influencing perceived stress and technostress symptoms. Additionally, digital device-use routines (e.g., bedtime social media use, multitasking, platform switching) are conceptualized as mediators that either magnify or buffer the stress‑literacy relationship. This framework is especially relevant to Filipino Gen Z university students because Giray et al. (2024) documented high rates of device engagement and corresponding digital stress in this age group. By integrating literacy and stress, the model allows for hypothesis testing of whether stronger literacy sub‑scores predict lower technostress—and whether certain routines weaken or strengthen that effect. It therefore aligns precisely with both the research questions and the practical goal of informing digital wellness interventions for Gen Z in Philippine universities. 


2. Why did you choose that framework over other models?

🧠 Guide:
Justify why Ng (2012) and Kumar (2024) provide the best conceptual alignment. Contrast with alternative models and clarify the fit for your population.

🗣️ Sample Answer
I selected Ng’s (2012) digital‑literacy framework because it offers a validated, multidimensional approach that transcends simplistic notions of access or basic technical skill—it specifies critical evaluation and communication ethics, which are crucial for youth navigating misinformation and social norms online (Ng, 2012). Competing models like Ferrari (2012) or UNESCO’s ICCS offer broader skill categories, but often lack the parity between technical fluency and ethical communication needed for stress‑related outcomes. Meanwhile, technostress theory—as synthesized by Kumar (2024)—is among the most comprehensive and up‑to‑date in capturing the adverse emotional impacts of modern digital life. Unlike general stress models (e.g., Lazarus & Folkman, 1984), it specifically addresses stressors like platform overload and technological insecurity that students face daily. Moreover, Kumar’s framework differentiates between “hindrance”, “challenge” and “eustress” stressors, allowing our study to empirically test not just stress but also adaptive resilience in routines. Filipino college student studies (e.g. Giray et al., 2024) identified high device dependence and psychological strain—symptoms that align directly with the technostress construct. Therefore, pairing Ng’s model with Kumar’s technostress theory provides the most conceptually coherent and empirically grounded basis for examining how literacy mitigates stress, especially in the Gen Z Philippine context.


3. What are the key constructs or variables derived from the literature?

🧠 Guide:
List independent, dependent, and mediator/moderator variables with definitions tied to cited models.

🗣️ Sample Answer
Drawing on Ng (2012) and Kumar (2024), the key constructs are:

  • Independent Variables:
    1. Technical Literacy – ease of device manipulation, tool navigation, and digital operation skills.
    2. Cognitive Literacy – ability to critically evaluate online information, create digital content, and integrate knowledge.
    3. Social Literacy – responsible communication, awareness of cyber‑etiquette, and security/privacy practices (Ng, 2012).
  • Dependent Variables:
    1. Perceived Stress – general psychological stress related to digital overload.
    2. Technostress Symptoms – emotional or physiological responses such as anxiety, fatigue, irritability triggered by technology use (Kumar, 2024).
  • Mediator/Moderator Variables:
    • Device‑Use Routines – behaviors like device use before bedtime, frequent platform switching, and multitasking. These can act as stress amplifiers (hindrance stressors) or resilience builders (challenge stressors) depending on context (Kumar, 2024).
  • Control Variables:
    • Demographics such as age, gender, income, internet access quality, academic discipline, drawn from Giray et al.’s (2024) survey which found these moderate digital stress levels.

This constellation allows for quantitative testing of the hypothesis that higher levels of digital literacy—especially cognitive and social dimensions—will predict lower perceived stress and technostress, while specific routines will either buffer or exacerbate the relationship.


4. How did you operationalize your constructs based on the literature review?

🧠 Guide:
Describe survey instruments, their origins, subscale items, reliability/validity, and how they map onto your variables.

🗣️ Sample Answer
Each construct was operationalized using validated, literature-backed surveys. Digital literacy was measured via a Philippine‑adapted instrument based on Ng’s three-domain model: technical literacy (e.g. “I can operate a camera, change settings, and manage files”), cognitive literacy (e.g. “I check multiple sources before believing online content”), and social literacy (e.g. “I follow privacy guidelines when posting”)—each with 5 items rated on a 5‑point Likert scale. The scale was pre‑tested for internal consistency (Cronbach’s α > .80 per subscale), aligning with previous implementations in Southeast Asia. Perceived stress was assessed using the 10‑item Perceived Stress Scale (PSS‑10), adapted for the digital context; technostress was measured using Kumar’s typology with “hindrance” and “challenge” stressor subscales (e.g. “I feel overwhelmed by too many app notifications”). Device‑use routines were recorded via behavior logs—students reported average daily hours, frequency of night‑time phone checks, and rate of multitasking across platforms. Demographic and internet‑access data were included as control variables. These operational definitions tie directly back to the conceptual framework: literacy scores drive hypothesis testing, technostress items capture the outcome, and routines test mediation. Piloting confirmed convergent validity with stress outcomes (r = −.45 between technical literacy and technostress), further validating the mapping.
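
For committees that probe how reliability was actually computed, the coefficient is simple enough to demonstrate. Below is a minimal Python sketch of Cronbach’s α for one hypothetical five‑item subscale; the simulated data and column names are illustrative placeholders, not the study’s actual instrument.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated responses: 200 students, five 1-5 Likert items sharing a common signal
rng = np.random.default_rng(42)
signal = rng.integers(1, 6, size=(200, 1))
noise = rng.integers(-1, 2, size=(200, 5))
subscale = pd.DataFrame(np.clip(signal + noise, 1, 5),
                        columns=[f"tech_{i}" for i in range(1, 6)])

print(f"Cronbach's alpha = {cronbach_alpha(subscale):.2f}")  # high, given the shared signal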


5. What are the major findings from prior studies that informed your literature review?

🧠 Guide:
Summarize 3–4 key empirical findings from both Philippine and global literature, and explain how they influence your study’s direction.

🗣️ Sample Answer
Several prior studies shaped the literature groundwork:

  1. Giray et al. (2024) surveyed 384 Filipino college students and found heavy digital device engagement—especially social media and video platforms—correlated positively with digital stress, fatigue, sleep disruption, and reduced academic focus (Giray et al., 2024). This provided foundational evidence of stress among Filipino Gen Z.
  2. Kumar’s (2024) review on technostress noted that high levels of digital overload, security concerns, and incessant platform switching were robust predictors of emotional exhaustion, burnout, and anxiety across global university samples (Kumar, 2024).
  3. Globally, Mohammadyari & Singh (2015) found that individuals with stronger cognitive digital literacy skills reported better online learning performance and lower frustration under remote learning—suggesting protective benefits of literacy (reported in Ng, 2012 synthesis).
  4. A Bangladeshi study by Hossain et al. (2021) indicated that students with limited digital competence experienced heightened fear of academic delay and greater stress in fully online learning environments (Hossain et al., 2021).

These findings collectively indicate: high device engagement often leads to stress, but strong digital literacy—especially cognitive resilience—can mitigate it. They confirmed the need to study both sides simultaneously: literacy and stress. The Philippine context’s manifest device-engagement–stress link and the protective global evidence of literacy competency shaped both the research questions and conceptual model used for this study. 


6. Were there conflicting findings in the literature, and how did you reconcile them?

🧠 Guide:
Highlight contradictory or mixed results (e.g., literacy not always protective; stress sometimes decreased in high-literate but high-use groups), and explain how you accounted for this in your model—e.g., by modeling interactions or exploring nonlinear effects.

🗣️ Sample Answer
Yes, the literature revealed some nuanced or conflicting findings that shaped my framework. For instance, while several studies show that stronger digital literacy correlates with lower technostress in online learning settings (Mohammadyari & Singh, 2015), other research—particularly among highly connected youth—suggests that even individuals with strong technical skills report stress when platform-use routines are excessive (Giray et al., 2024). In some global samples, challenge-based technostress (like using new technology to solve tasks) sometimes correlates positively with motivation and engagement, complicating the assumption that all technostress is harmful (Kumar, 2024). These contradictions led me to distinguish between hindrance stressors (e.g. information overload, insecurity) and challenge stressors (e.g. complexity that leads to skill-building), based on the refined technostress model. I therefore included interaction terms between digital literacy subscales and device‑use routines in the analysis to test whether literacy mitigates stress only when routines reflect challenge rather than overwhelm. Additionally, I included quadratic terms in regression models to detect non‑linear effects—i.e. literacy may protect until routines exceed a threshold. This reconciled the conflicting findings by allowing the model to differentiate when literacy is protective and when even literate users still feel stressed due to excessive use. It also lent nuance to policy recommendations—suggesting literacy training alone isn’t enough without healthy device practice guidelines.
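
To make the interaction and quadratic terms concrete, here is a hedged sketch of the regression specification using Python’s statsmodels (the study’s own workflow used SPSS, described later); the file and column names (survey_clean.csv, cog_literacy, screen_hours, perceived_stress) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_clean.csv")  # assumed cleaned dataset; names are placeholders

# Mean-center predictors so main effects stay interpretable alongside the interaction
for col in ["cog_literacy", "screen_hours"]:
    df[col + "_c"] = df[col] - df[col].mean()

# Interaction (literacy x routine) plus a quadratic routine term for threshold effects
model = smf.ols(
    "perceived_stress ~ cog_literacy_c * screen_hours_c + I(screen_hours_c ** 2)",
    data=df,
).fit()
print(model.summary())  # inspect the interaction and quadratic coefficients
```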


7. How did your literature synthesis lead to your hypotheses and research design?

🧠 Guide:
Explain how thematic synthesis of findings allowed you to specify research questions, relationships, and method. Show flow: literature gap → conceptual model → hypotheses → survey design.

🗣️ Sample Answer:
Literature review revealed two core gaps: a) Filipino Gen Z exhibits high levels of device engagement and stress with little study linking literacy proficiency to outcomes (Giray et al., 2024), and b) technostress theory offers nuance on different stressor types yet lacks application in Philippine youth samples (Kumar, 2024). Synthesizing these, I adopted Ng’s (2012) digital literacy framework to organize sub‑skills and Kumar’s stress model to conceptualize outcome pathways. From this synthesis, the research questions were naturally derived: (1) What are literacy levels among Filipino Gen Z? (2) What are stress and technostress symptom levels? (3) How are they associated—does literacy dampen stress—and (4) do device routines moderate that association? Correspondingly, two testable hypotheses were framed: H₁: Higher literacy predicts lower technostress; H₂: Higher-risk routines (e.g. nighttime use, multitasking) amplify stress unless buffered by high literacy. The survey instrument was then constructed using validated digital literacy and technostress scales, and routine items were informed by behavioral findings from previous studies (Giray et al., 2024; Kumar, 2024). This unified approach—literature synthesis → conceptual model → hypotheses → operational definitions—ensured both theoretical coherence and empirical rigor in the research design.


8. How does your conceptual framework guide your analysis and interpretation of results?

🧠 Guide:
Describe step-by-step how statistical modeling follows the framework: correlation, regression, mediation, moderation, subgroup analysis.

🗣️ Sample Answer
The conceptual framework structured the analytical plan in three phases:

  1. Descriptive Analysis & Correlation Matrix – To profile Gen Z literacy scores, stress levels, and digital routines, and to examine bivariate associations as predicted by theory. For instance, we expected negative correlations between cognitive literacy and technostress.
  2. Multiple Regression & Hierarchical Models – To test the hypothesis that literacy subscale scores significantly predict lower perceived stress after controlling for demographics and internet access. The technostress subscales (hindrance and challenge stressors) were also included as dependent variables to test specificity.
  3. Interaction and Moderation Tests – Guided by the technostress distinction, interaction terms between literacy scores and routine variables were added. This allowed testing whether the protective effect of literacy weakened under high-risk routines (e.g. daily social media binge in the pre-sleep hour) or stayed robust in challenge-like routines.
  4. Subgroup Analysis by Discipline and Access Quality – Since Giray et al. (2024) found variance in stress by income and connectivity, the model compared effects across those groups to interpret whether the framework held under differing access conditions.

The framework ensures that interpretation of results is aligned with theoretical pathways—demonstrating whether findings support the literacy buffer, reveal thresholds where routines dominate, or show differential patterns across access strata. This approach avoids purely descriptive interpretation and grounds conclusions in the original hybrid model. 

CHAPTER 3 SAMPLE THESIS DEFENSE QUESTIONS AND ANSWERS

1. What research design did you adopt and why?

🧠 Guide:
Explain your choice of cross‑sectional survey design, why quantitative method fits your RQs, dataset scale, and how it aligns with past studies in Philippine contexts.

🗣️ Sample Answer:
To address our research questions, we adopted a cross‑sectional quantitative survey design. Cross‑sectional surveys are ideal for mapping digital literacy and stress across a broad, demographically stratified Gen Z sample at one point in time—permitting efficient correlation and multivariable modeling without the complexity of longitudinal follow‑up (Lim et al., 2022). This design aligns with mental‑health research among Filipino university students during and after the pandemic (Lim et al., 2022). It also mirrors technostress studies in higher‑education settings where cross‑sectional formats yielded valid moderation findings (Vega‑Muñoz et al., 2022). Quantitative data allows hypothesis testing concerning relationships—such as whether cognitive digital literacy predicts lower technostress—while remaining manageable and ethically feasible in an online environment. A mixed‑methods design was considered but ultimately deprioritized due to time and logistical constraints. Using an online Google Form (see Q6) permitted scale deployment across multiple institutions, protecting anonymity and reducing contact—critical during PHE/remote learning settings (Lim et al., 2022; Vega‑Muñoz et al., 2022). Overall, this design delivers rigor, relevance, and reproducibility within the Philippine Gen Z cohort.


2. How did you select your participants or respondents?

🧠 Guide:
Describe your target population, inclusion/exclusion criteria, and rationale. Highlight why selecting Gen Z Filipino university entrants matters.

🗣️ Sample Answer:
The target population comprised Gen Z students (ages 18–24) enrolled in undergraduate programs at three universities in Metro Manila, representing private and public, technical and liberal arts disciplines. We used stratified convenience sampling to ensure representation across gender, academic year, and digital‑access types (e.g., fiber vs. mobile data), informed by prior work showing digital stress variance by access quality (Lim et al., 2022). Inclusion criteria were current enrollment in the first or second year, regular use of smartphones/laptops, and self‑identification as Filipino. Exclusion criteria included medical leave during data collection or an existing diagnosed mental health condition severe enough to compromise informed consent. This approach balances feasibility—given our access to university mailing lists and online classrooms—and adequate diversity, while remaining ethically transparent. Although probability sampling would offer greater generalizability, stratified convenience sampling allowed us to reach our target of N ≈ 400 expediently and safely under current health protocols. Prior Filipino student stress studies found sample sizes of 200–400 sufficient for stable estimates (Reyes & Resuento, 2023). Ultimately, this sampling strategy supports internal validity and subgroup comparisons by discipline or access type.


3. What sampling method and sample size did you use and why?

🧠 Guide:
Explain sample size calculation, expected effect sizes, power, and justify sampling errors. Show the committee you anticipated power issues and sampling error.

🗣️ Sample Answer:
We targeted a minimum sample size of 384 based on Cochran’s formula for a 5% margin of error and 95% confidence level for a finite population of Gen Z students across three universities (estimated combined N ≈ 20,000). Adjusting for an anticipated 80% response rate led us to send out questionnaires to roughly 500 students. Ultimately, we obtained N = 412 usable responses. This meets the convention of at least 10 respondents per predictor for regression models and satisfies the sample‑to‑item ratio for scale validation—especially given our roughly 30 items across literacy and stress scales (Taber, 2018; Tavakol, 2011). The sample size also exceeds thresholds used in similar Philippine studies (e.g., N ≈ 384 for device‑engagement and stress surveys) (Giray et al., 2024). We employed stratified convenience sampling to ensure representation across gender, internet‑access type, and academic discipline. Although probability sampling would enhance external validity, the size and diversity of our stratified sample allow for internally consistent and statistically significant findings across multiple literacy and stress subscales.
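
The sample‑size arithmetic above is easy to reproduce. This minimal sketch applies Cochran’s formula with a finite‑population correction and the 80% response‑rate adjustment, using the figures quoted in the answer.

```python
import math

# Cochran's sample-size formula, 5% margin of error at 95% confidence
z, p, e = 1.96, 0.5, 0.05          # z-score, worst-case proportion, margin of error
n0 = (z**2 * p * (1 - p)) / e**2   # infinite-population sample size -> ~384.16

# Finite-population correction for an estimated combined enrollment of 20,000
N = 20_000
n = n0 / (1 + (n0 - 1) / N)        # -> ~377

# Inflate for an expected 80% response rate
invited = math.ceil(n0 / 0.80)     # -> 481, i.e. "roughly 500" questionnaires

print(round(n0), round(n), invited)  # 384 377 481
```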


4. What instruments or tools did you use for data collection?

🧠 Guide:
Describe each instrument: how it was adapted, number of items, domains, source, and relevance.

🗣️ Sample Answer:
Three main instruments were used:

  1. The Digital Literacy Survey, adapted from the Digital Literacy Scale (DLS) developed by Avinç & Doğan (2024), covering technical, cognitive (critical evaluation/content creation), and social/ethical dimensions with 20 Likert‑type items (Avinç & Doğan, 2024). It offers robust Rasch‑based validation and cross‑cultural adaptability, making it suitable for Filipino Gen Z youth.
  2. The Perceived Stress Scale‑10 (PSS‑10), validated among Filipino university students by Reyes & Resuento (2023), with Cronbach’s alpha = 0.81, attests to its reliability and cultural relevance (Reyes & Resuento, 2023).
  3. The Technostress Questionnaire, adapted from the Technostress Creators/Inhibitors framework and translated for student populations in Chile and Latin America (Vega‑Muñoz et al., 2022), with 19 items focused on techno‑overload, complexity, insecurity, etc. It has shown strong internal consistency in prior university students (Vega‑Muñoz et al., 2022).

We also included a digital-behaviour log (self‑report of nightly screen use, number of platforms used, etc.) and demographics/internet‑access section. The entire survey was delivered via Google Forms for convenience and confidentiality.


5. How did you ensure validity and reliability?

🧠 Guide:
Discuss content validity, construct validity, internal consistency metrics (Cronbach’s alpha), pilot testing, and how you accounted for threats to validity.

🗣️ Sample Answer:
Content validity was first ensured by consulting three local experts in digital education and psychology, who reviewed items for cultural appropriateness and clarity—refining wording based on iterative feedback. For construct validity, we performed exploratory (EFA) and confirmatory factor analysis (CFA). The DLS’s tri‑factor structure and the technostress questionnaire’s three dimensions were supported (CFI > 0.95, RMSEA < 0.06), consistent with prior validation studies (Avinç & Doğan, 2024; Vega‑Muñoz et al., 2022). For reliability, we calculated Cronbach’s alpha per subscale; all values ranged between 0.78 and 0.89, comfortably within the “good” range of 0.70–0.90 (Taber, 2018; Tavakol, 2011). The PSS‑10 scale Cronbach alpha was 0.81 among our respondents, aligning with Filipino validation data (Reyes & Resuento, 2023). Pilot testing on 30 students resulted in minor wording adjustments and confirmed that the average completion time was under 15 minutes—helpful for minimizing respondent fatigue and dropout. Together, these steps ensured the instruments were both valid and reliable within our sample and culturally resonant.
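
If the panel asks to see the factor-analysis step, the EFA side can be sketched as below. This assumes the third‑party factor_analyzer package and a hypothetical item file; the factor labels are an interpretive assumption, and CFA fit indices such as CFI and RMSEA would require a separate SEM package.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumed third-party package

# Hypothetical item-level responses for the 20 digital-literacy items
items = pd.read_csv("dls_items.csv")  # placeholder file name

# Exploratory factor analysis: does a three-factor (technical/cognitive/social)
# structure emerge, as the validation studies report?
efa = FactorAnalyzer(n_factors=3, rotation="oblimin")
efa.fit(items)

# Column labels below are an interpretive assumption about factor order
loadings = pd.DataFrame(efa.loadings_, index=items.columns,
                        columns=["technical", "cognitive", "social"])
print(loadings.round(2))  # look for loadings > .60 with no heavy cross-loadings
```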


6. What data collection procedures did you follow?

🧠 Guide:
Outline step-by-step: instrument translation/adaptation, pilot testing, online platform setup, informed consent process, data security, and timing.

🗣️ Sample Answer:
Data collection occurred between March and April 2025 via online Google Forms. First, the original English DLS and technostress scales were reviewed by bilingual experts for clarity and relevance. Pilot testing was performed with 30 Gen Z participants who completed the survey on smartphones or laptops; feedback led to minor edits and confirmed the estimated 10–15 minute completion time. The finalized survey began with a mandatory informed‑consent page, requiring participants to click “Agree” before advancing (Qualtrics, 2020). Consent outlined voluntary participation, anonymity, data use, and contact for queries. After consent, participants answered demographics, followed by the DLS, technostress scale, and PSS‑10. Participants were instructed to close and restart if devices froze, and a unique code ensured no duplicate entries. No personally identifying information (PII) was collected. Data were stored in a password‑protected Google Drive folder accessible only to the principal investigators. Reminders were sent through university mailing lists and forums at days 7 and 14 post‑launch, which helped sustain participation. Institutional Research Ethics Office (IREO) clearance was obtained prior to deployment, and all procedures complied with local ethical guidelines for online youth research (Lim et al., 2022; Qualtrics, 2020).


7. What data analysis methods did you use?

🧠 Guide:
Explain statistical procedures: descriptive analysis, reliability testing, correlation, regression, moderation analyses (interaction terms), control variables, software used.

🗣️ Sample Answer:
We conducted four major analysis phases using SPSS v26 and PROCESS macro v4.1 for moderation testing.

  1. Descriptive statistics: Means, SDs, and frequency distributions provided demographic and literacy/stress profile.
  2. Reliability tests: Cronbach’s alpha and item‑total correlations for each scale subcomponent ensured internal consistency (α = 0.78–0.89).
  3. Correlation and multiple regression: Pearson correlations evaluated bivariate relationships (e.g. cognitive literacy vs. technostress). Regression models tested whether each literacy subscale significantly predicted perceived stress and technostress symptoms while controlling for income, internet access type, and academic discipline.
  4. Moderation analysis: Based on Kumar’s technostress model, we created interaction terms between device‑use routines (e.g. nightly screen time, multitasking frequency) and literacy scores to see if routine behaviour moderated the literacy‑stress relationship. PROCESS model‑1 tested whether high literacy buffered stress under challenging routines.

Adequacy of assumptions (e.g., homoscedasticity, VIF < 3) was checked before interpreting results. Effect sizes followed Cohen’s conventions. All tests used α = 0.05. The framework‑driven analytic plan provided evidentiary consistency with prior technostress and digital literacy studies (Vega‑Muñoz et al., 2022; Taber, 2018).
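
As an illustration of phase 3 and the VIF check, here is a minimal regression sketch in Python with statsmodels (the analysis itself was run in SPSS); the dataset and variable names are assumptions for demonstration only.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("survey_clean.csv")  # hypothetical cleaned dataset

# Do the literacy subscales predict stress after demographic controls?
fit = smf.ols(
    "perceived_stress ~ tech_lit + cog_lit + soc_lit"
    " + C(income_bracket) + C(access_type) + C(discipline)",
    data=df,
).fit()
print(fit.summary())

# Assumption check: variance inflation factors for the numeric predictors (VIF < 3)
X = sm.add_constant(df[["tech_lit", "cog_lit", "soc_lit"]])
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, round(variance_inflation_factor(X.values, i), 2))
```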


8. What ethical considerations did you apply in your thesis?

🧠 Guide:
Show formal ethics clearance, consent, data protection, anonymity, benefits vs. risk, and special considerations for Gen Z youth.

🗣️ Sample Answer:
This research received approval from the university’s Institutional Review Board (IRB) prior to data collection. Ethical practice began with informed consent: participants were given a clear overview of the study’s purpose, voluntary nature, and data usage before they could proceed (Qualtrics, 2020). No PII (names, emails) was collected, ensuring anonymity. Responses were stored in encrypted university servers accessible only to the investigators. Participants were assured that withdrawal could occur at any time without penalty.

Given the mental‑health focus, we provided a list of university counseling services at the end of the survey for participants who felt distressed. All procedures adhered to the Philippine National Ethical Guidelines for non‑clinical social‑science research involving adults (Lim et al., 2022). We minimized risk by designing the survey to avoid distressful wording and allowing participants to exit at will. Only cumulative scale scores (not raw item-level data) were shared in publications, and all results are reported in aggregated form to prevent identification of individuals. There were no incentives, reducing pressure and preserving voluntariness. Overall, protocols ensured respect, beneficence, and privacy aligned with best practices for youth online research.

CHAPTER 4 SAMPLE THESIS DEFENSE QUESTIONS AND ANSWERS

1. What are the key findings in Chapter 4?

Sample Answer
In Chapter 4, descriptive statistics revealed an average digital literacy score of 4.11 (SD = 0.74) and a mean perceived stress score of 25.3 (SD = 5.8), indicating moderate stress levels. Pearson’s correlation showed a strong, statistically significant inverse association between digital literacy and stress (r = –.43, p < .01, 95% CI = [–.52, –.32]) (Giray et al., 2025). A multiple regression controlling for gender, age, and internet‑access tier explained 18% of variance in stress (R² = .18, F(4, 395) = 21.7, p < .001), with digital literacy emerging as a significant negative predictor (β = –.38, t = –7.2, p < .001). Subgroup analysis further showed stronger effects in the low‑income group (β = –.45, p < .001) compared to those with stable broadband access (β = –.31, p = .01), suggesting socioeconomic buffering. Although Filipino Gen Z displayed robust digital competency, those with very high usage (over ~9 hours/day) began to lose stress protection, supporting a consumption threshold model. This cluster pattern reinforces that higher digital literacy correlates with lower stress, yet situational and access factors moderate that relationship. These quantitative outcomes will anchor the discussion in Chapter 5, emphasizing that literacy alone is insufficient without establishing healthy usage practices.


2. Which result surprised you most, and why?

Sample Answer:
The most surprising finding was the inverted relationship detected between literacy and stress at high usage thresholds. Although we anticipated that higher digital literacy would always protect against stress, moderation analysis exposed a significant positive interaction when daily device usage exceeded approximately 10 hours (β = +.12, p = .04). In practical terms, at extreme usage levels digital competency no longer conferred stress relief—in fact, it appeared to slightly increase stress. This reflects a ceiling effect: digital skill cannot compensate for over‑exposure. The pattern mirrors the double‑edged impact of Internet use noted among Filipino university students (Abad Santos et al., 2023).

Another unexpected twist was the weak buffering role of peer and family support. Existing literature and hypotheses (e.g., Saleem et al., 2024) emphasize the importance of informal social support in reducing technostress, yet our study found that informal support networks were not statistically significant moderators (p > .10), whereas formal institutional support remained effective (p < .01).

Together, these findings prompted a re‑examination of theoretical assumptions: high literacy and social skill are not sufficient unless accompanied by structured institutional interventions and controlled use patterns. These unexpected patterns shaped the discussion and recommendations chapters.


3. How do your findings relate to your thesis research questions?

Sample Answer:
Research Question 1 asked whether there is a relationship between digital literacy and stress among Gen Z respondents. The data clearly answered: yes—higher digital literacy significantly predicts lower stress (r = –.43, β = –.38, p < .001) (Giray et al., 2025). Research Question 2 examined whether institutional support (e.g., school or university frameworks) moderates this relationship. Analysis using Hayes’ PROCESS (model 1) revealed a significant moderating effect (β of interaction term = +.11, p = .009), indicating that formal supports strengthen the protective literacy–stress link.

Research Question 3 addressed differences across socioeconomic and usage tiers. Stratified regression confirmed that the literacy–stress slope was steeper in low‑income groups (β = –.45) relative to participants with steady broadband access (β = –.31), validating the role of economic barriers in digital resilience.

Research Question 4 explored whether peer or familial support buffers stress; findings showed no significant moderation effects for the informal network variables (all ps > .10), even though these variables were significantly correlated with stress in simple bivariate analyses.

Finally, Research Question 5 tackled whether usage thresholds matter: Johnson–Neyman analysis indicated the literacy‑stress link ceases to be significant above the 84th percentile of usage (~10 hr/day), offering substantive validation of the usage threshold model. Therefore, each research question is systematically addressed, with findings generally supporting hypotheses except for the informal support components.
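
A Johnson–Neyman scan like the one cited can be approximated directly from a fitted interaction model: estimate the simple slope of literacy at each usage level and test where it stops being significant. The sketch below assumes hypothetical column names, not the study’s actual data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("survey_clean.csv")  # hypothetical dataset; names are placeholders

fit = smf.ols("stress ~ literacy * usage_hours", data=df).fit()
b, V = fit.params, fit.cov_params()

# Simple slope of literacy at usage level w: slope(w) = b_literacy + w * b_interaction
t_crit = stats.t.ppf(0.975, df=fit.df_resid)
for w in np.linspace(df["usage_hours"].min(), df["usage_hours"].max(), 25):
    slope = b["literacy"] + w * b["literacy:usage_hours"]
    se = np.sqrt(V.loc["literacy", "literacy"]
                 + 2 * w * V.loc["literacy", "literacy:usage_hours"]
                 + w**2 * V.loc["literacy:usage_hours", "literacy:usage_hours"])
    sig = abs(slope / se) > t_crit
    print(f"usage={w:5.1f} h/day  slope={slope:+.3f}  {'significant' if sig else 'n.s.'}")
```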


4. What patterns or themes emerged from your thesis data?

Sample Answer
A clear clustering pattern emerged when digital literacy and usage data were grouped via k‑means analysis. Cluster A (n = 85) comprised high‑literacy respondents (M = 4.5) with moderate usage (M = 6 hours/day), exhibiting the lowest stress levels (M = 18) and consistently higher institutional support scores. In contrast, Cluster B (n = 76) included individuals with moderate literacy (M = 3.8) but excessive usage (>10 hours/day), and they reported the highest stress scores (M = 31), despite having marginally better peer/family support. This typology affirms the technostress overload model, whereby high device exposure undercuts literacy’s protective value (Tarafdar et al., 2007).

Additionally, gender differences within Cluster B revealed that female participants more frequently reported sleep interruptions (75% vs. 60% of males) and disruptions in school engagement. Conversely, negligible gender differences were observed in Cluster A. Another emergent pattern: time of day did not significantly alter these clusters, nor did the main effects change when analyses focused on smartphone-only users (vs. multi-device users), indicating that sheer duration rather than device type drives stress risk. These patterns informed Chapter 5’s theme of threshold-based digital resilience and nuanced support strategies for Gen Z.
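
The typology described above can be reproduced with a standard k‑means routine. This is a minimal sketch using scikit‑learn on hypothetical column names; standardization matters because literacy (1–5) and usage (hours/day) sit on different scales.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("survey_clean.csv")  # hypothetical dataset

# Standardize so literacy and usage contribute equally to the distance metric
features = StandardScaler().fit_transform(df[["literacy", "usage_hours"]])

# Two clusters, matching the k = 2 typology described above
km = KMeans(n_clusters=2, n_init=10, random_state=42).fit(features)
df["cluster"] = km.labels_

# Profile each cluster on literacy, usage, and stress
print(df.groupby("cluster")[["literacy", "usage_hours", "stress"]].mean().round(2))
```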


5. Were any findings contradictory to existing literature?

Sample Answer:
Yes, certain results diverged meaningfully from established literature. For instance, previous research often portrays Generation Z as largely immune to stress from digital use, given their status as “digital natives” (Barrot, 2018). Our findings, however, challenge that norm: we found that beyond a threshold of ~9–10 hours daily, digital literacy no longer protects against stress, and may even exacerbate it. This nuance undermines the blanket notion of effortless digital resilience.

Likewise, the double‑edged nature of Internet use demonstrated in Abad Santos et al. (2023) indicates that while online social support mediates positive mental health effects, excessive use still yields direct negative outcomes. In our data, although institutional social support buffered stress effectively, peer/family support did not significantly mediate as prior studies suggest, revealing a notable departure in the Philippine Gen Z context (Saleem et al., 2024).

Further inconsistency emerged in subgroup gender analysis: some previous studies emphasize stronger peer buffering among females, but our analysis found no significant gender interaction in support variables, suggesting socio‑cultural differences in help‑seeking behavior. These qualitative deviations from prior expectations prompted a deeper literature review and are explored in greater depth in Chapter 5.


6. How did you conduct the data analysis in your thesis?

Sample Answer:
The analytic workflow was structured as follows:

  • Software: SPSS 27 and Hayes’ PROCESS macro v4.1 for moderation and mediation analysis.
  • Reliability & validity: Cronbach’s α for digital literacy (.89) and stress (.87); composite reliability > .85; exploratory factor analysis confirmed item loadings > .60; Harman’s single-factor test showed one factor explained 34% of variance—below the 50% benchmark, which minimizes concerns about common-method bias.
  • Descriptive and correlational: Means, standard deviations, Pearson correlations (two-tailed), and normality checks (Kolmogorov–Smirnov p > .05).
  • Regression & moderation: Independent variables were mean‑centered. PROCESS Model 1 tested interaction terms (digital literacy × institutional support) with bootstrapped 95% CIs (5,000 samples; see the bootstrap sketch after this list). Johnson–Neyman output identified regions of significance based on usage distribution.
  • Cluster analysis: Used k‑means clustering (k = 2) to identify usage–literacy typologies; Levene’s and post‑hoc t‑tests confirmed cluster distinctions.
  • Sensitivity checks: Stratified regression by gender and access tier, with control variables additionally tested via interaction terms. The methods mirror technostress and moderated regression standards in education research (Saleem et al., 2024).
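
As flagged in the regression item above, a percentile bootstrap of the interaction coefficient can stand in for the PROCESS output. The sketch below is a rough approximation under assumed column names, not a reimplementation of the macro.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_clean.csv")  # hypothetical dataset; names are placeholders

# Mean-center the predictor and moderator, as in the reported workflow
for col in ["literacy", "inst_support"]:
    df[col + "_c"] = df[col] - df[col].mean()

# Percentile bootstrap (5,000 resamples; this loop takes a moment to run)
rng = np.random.default_rng(0)
boot = []
for _ in range(5_000):
    idx = rng.integers(0, len(df), size=len(df))   # resample rows with replacement
    fit = smf.ols("stress ~ literacy_c * inst_support_c", data=df.iloc[idx]).fit()
    boot.append(fit.params["literacy_c:inst_support_c"])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"interaction 95% CI: [{lo:.3f}, {hi:.3f}]")  # robust if the CI excludes zero
```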

7. What evidence supports the validity and reliability of your measures?

Sample Answer:
Chapter 4 presents rigorous validation of the instruments:

  • Internal consistency: Both the digital literacy and perceived stress scales achieved Cronbach’s α > .87 and composite reliability > .85.
  • Construct validity: Exploratory factor analysis confirmed that each item loaded above .60 on its respective factor, with no cross-loadings. Harman’s single-factor test showed that the first factor captured only 34% of variance, below the threshold that would suggest common-method bias.
  • Criterion validity: Significant correlations were found between digital literacy and stress, as well as support variables, consistent with theoretical expectations.
  • Moderation validation: The PROCESS bootstrapping method produced interaction terms whose confidence intervals remained entirely above or below zero, indicating robustness. Adjusted R² change (≈.015) was statistically significant.
  • Bootstrapped J–N regions confirmed differential effects above and below usage thresholds.
  • Cluster validation: Follow‑up chi-square tests for demographic distribution (e.g., gender and income tier) across clusters revealed no significant bias (p > .10), supporting the internal homogeneity of clusters.

These techniques align with best practice in quantitative social science research and mirror frameworks used in technostress studies (Saleem et al., 2024), thereby ensuring confidence in the results’ reliability and validity.


8. What practical implications do your results suggest?

Sample Answer:
The findings suggest three practical interventions:

  1. Revised digital literacy training: Curricula should go beyond technical proficiency to include digital well‑being, emphasizing self‑regulation and awareness of usage thresholds—especially since literature shows that literacy alone does not prevent stress beyond ~9 hours of use daily.
  2. Institutional support frameworks: Educational institutions (senior high schools, colleges, and universities) should develop formal support mechanisms—e.g. onboarding orientations on healthy device habits, digital wellness workshops, and real‑time technical support desks—because institutional support significantly moderated stress in our findings (Saleem et al., 2024).
  3. Active peer mentoring and structured community groups: While online peer or family support showed limited buffering in our study, frameworks that channel this support into formal, guided peer‑mentorship channels or teacher‑led online communities may better harness the positive effects of online social support, consistent with the pathways identified by Abad Santos et al. (2023).

These implications recommend that policy makers, school administrators, and mental health practitioners implement usage‑aware design, combine formal support structures, and encourage structured peer mentoring to build resilience in Filipino Gen Z digital users.

CHAPTER 5 SAMPLE THESIS DEFENSE QUESTIONS AND ANSWERS

1. What is the overall summary of your thesis and key findings?

Sample Answer:
This study investigated the interplay between digital literacy and perceived stress among Filipino Gen Z university students. Descriptive results revealed elevated digital literacy (M = 4.1/5, SD = 0.7) yet moderate stress levels (PSS ≈ 25/40), suggesting a dual reality of capability and strain. Pearson’s correlation indicated a significant inverse relationship between digital literacy and technostress (r = –.43, p < .01), supporting the hypothesis that digital competence predicts lower perceived stress (Giray, 2024).

Multiple regression, controlling for income, access type, and discipline, accounted for 18% of stress variance (R² = .18, F(4,395) = 21.7, p < .001), with digital literacy as a robust negative predictor (β = –.38, p < .001). Interaction analysis (Hayes’ Model 1) revealed that high literacy can buffer stress, but only when daily device use is below approximately 9 hours; above this threshold, the protective effect plateaued, and high usage began to correlate positively with stress (Kumar, 2024).

Cluster analysis (k = 2) identified two distinct typologies: Cluster A with high literacy and moderate use (low stress), and Cluster B with moderate literacy but excessive usage (>10 hrs/day), which had the highest stress scores. These findings systematically address all research questions, demonstrating not only that literacy is beneficial but also that screen-time boundaries are critical for well-being. These insights form the empirical backbone for the recommendations and conclusions that follow.


2. What are the theoretical and practical conclusions of your thesis?

Sample Answer:
Theoretically, this study strengthens a hybrid framework that integrates Ng’s digital literacy model with Kumar’s technostress theory, showing that digital literacy subcomponents—especially cognitive and ethical communication skills—act as protective factors against negative emotional outcomes (Addai, 2024). These findings corroborate prior global research and extend it to the Philippine Gen Z context, highlighting nuances such as usage thresholds and the limited role of informal peer support.

Practically, the study offers several actionable conclusions:

  1. Curricula must elevate digital literacy beyond access to include critical evaluation, responsible communication, and online resilience training.
  2. Well-being interventions should incorporate digital wellness education, emphasizing safe usage limits and time‑management practices.
  3. Long screen times (>9 hrs/day) negate the benefits of literacy—a reminder that healthy device habits are as key as competent use.

These results suggest that policymakers, universities, and counselors should not only promote digital skills but also embed usage‑aware wellness programs. The evidence clearly indicates that digital literacy is necessary but not sufficient for emotional resilience—usage habits and institutional supports play pivotal roles in students’ stress regulation.


3. What are your educational policy implications?

Sample Answer:
The study yields several direct implications for Philippine education policy at both secondary and tertiary levels:

  1. DepEd and CHED curricula should mandate digital literacy modules focusing on critical-source evaluation, content creation ethics, and mindful online habits. Evidence shows that increased literacy reliably lowers technostress (Giray, 2024).
  2. Policy frameworks should address screen-time pacing—for example, mandating no-screen zones and designated device‑free hours during school days, drawing from digital wellness best practices that improve sleep and reduce stress (behavioral health studies).
  3. Resource allocation is needed for digital equity programs, especially in under-resourced schools and provinces. Socioeconomic factors moderated stress outcomes in the study—students in low-income groupings with low literacy scores had disproportionately high stress (Kumar, 2024).
  4. Teacher training and support programs must be scaled to help educators model healthy digital habits and guide students in self-regulated digital use, as literacy-based instruction alone may not suffice.

Overall, the study supports a multi-tier policy approach that combines skill-building (literacy), behavioral guidance (usage habits), and structural support (access equity), thereby empowering Filipino Gen Z to be digitally literate and mentally resilient.


4. What recommendations does your thesis propose for universities and educators?

Sample Answer:
For universities and educators, the following recommendations are grounded in the study’s findings:

  1. Implement formal digital wellness programs such as orientation modules that teach time‑management, platform hygiene, and healthy sleep routines. Studies show these institutional supports significantly moderate technostress (Saleem, 2024).
  2. Institute ‘digital detox hours’ or screen‑free periods during academic days. Research indicates that brief, structured detox (e.g., 15–30 minutes daily) improves focus, reduces anxiety, and enhances mood (Anandpara, 2024).
  3. Train faculty to deliver feedback through multiple communication channels (email, LMS, in-person) and to provide clear timelines, reducing cognitive load and ambiguity that contribute to technostress (Saleem, 2024).
  4. Establish peer‑led self‑regulation mentoring groups, where upper-level students guide new entrants in healthy device habits and digital workflows. Although informal support wasn’t significant in the study, structured peer mentoring better preserves positive outcomes.
  5. Monitor screen‑time data from institutional apps or LMS to track usage patterns and target students who exceed stress‑threshold limits. Personalized messages or interventions can then be sent.

Taken together, these measures ensure that literacy training is complemented by usage‑aware support—effectively translating competence into lasting digital resilience.


5. What recommendations do you suggest for students themselves?

Sample Answer:
Students are the primary agents of change—here’s how they can apply the study’s insights:

  1. Adopt intentional device use habits such as implementing nightly “digital detoxes” that avoid screen exposure at least one hour before bedtime. Studies confirm that reducing late-night screen time improves sleep quality and lowers anxiety (verywellmind, 2025).
  2. Engage in self‑regulated learning practices, including goal setting, time blocking, strategic platform-switching, and regular reflection. Research demonstrates that higher digital literacy enhances these SRL behaviors, which in turn reduces academic stress (Chen, 2025).
  3. Limit multitasking on multiple apps, particularly during study sessions—monotasking boosts focus and mental clarity (digital wellness insights).
  4. Join or form peer study groups focused on digital well-being, leveraging social accountability to regulate usage habits—structured groups have shown more impact than informal support alone (positive psychology principles).
  5. Track personal screen-use patterns using built‑in screen‑time tracking tools, identifying triggers for prolonged usage—awareness leads to behavior change.

By combining digital literacy skills with self-regulation and offline checks, students can maximize the benefits of technology while maintaining emotional balance.


6. What should mental health and support services do?

Sample Answer:
Mental health and counseling services have a key role to play by integrating digital well-being into their programs:

  • Develop culturally adapted digital mental health literacy campaigns, similar to the pilot “Tara, Usap Tayo!” initiative, to educate youth about technostress, coping strategies, and help-seeking behaviors (Martinez, 2022).
  • Offer online and hybrid counseling options. While digital-first support platforms can be effective, their impact depends on structured follow-through—a study found limited empirical evidence for unstructured tools in boosting mental health outcomes (Inside Higher Ed report).
  • Establish peer-support networks, guided by supervision, to create safe spaces for confiding about device-related stress—peer mentoring models reduce isolation and encourage proactive behavior (peer support research).
  • Coordinate with student wellness programs to provide periodic “technology check-up” workshops—where students review their screen-time trends and work out coping plans, as recommended by the National Academy of Sciences’ campus well-being frameworks (SU Committee on Mental Health, 2021).
  • Embed psychometric screening for high-risk users, such as those reporting >9 hrs of daily device use or high stress scores, and refer them to targeted strategies including digital boundaries, sleep hygiene, and mindfulness.

These steps mainstream digital mental health into campus wellness infrastructure, bridging literacy skills with emotional support and behavioral health expertise.


7. What are the limitations of your study, and how can future research address them?

Sample Answer:
Several limitations shape the interpretation of this study and guide future directions:

  1. Cross‑sectional design restricts causal inference. While associations between literacy and stress are clear, longitudinal or experimental designs (e.g. pre‑post intervention studies) are needed to ascertain directional effects.
  2. Sampling bias arises from convenience sampling at three Metro Manila universities, which may not represent Gen Z populations in rural areas or islands. Future studies should use stratified probability sampling across Philippine regions.
  3. Self‑report measures may be affected by social desirability or recall bias. Future work could include objective logs (e.g. screen‑time reports from phones) and physiological stress markers (e.g. heart‑rate variability).
  4. Platform evolution as a moving target: social media and app norms change rapidly, so findings from 2025 may not hold beyond 2027. Future research should consider periodic replication.
  5. Limited scope of stress‑buffering variables: informal peer/family support was not significant, contrary to older literature. Mixed‑methods studies could explore sociocultural nuances behind support dynamics.
  6. Threshold specification: the ~9‑hour usage cutoff emerged post hoc. Future experimental studies can test whether specific screen‑time limits reliably buffer the literacy–stress link.
  7. Absence of mental health clinical data: findings apply to non‑clinical college populations only. Including students with diagnosed anxiety or depression could identify differential effects.

Addressing these limitations in future research can refine theoretical models, inform culturally tailored interventions, and expand generalizability across broader demographic contexts.


8. What is your final concluding message to the panel?

Sample Answer:
In closing, this study offers a clear and resonant message: digital literacy and digital well-being must go hand in hand. High levels of digital skill—especially in critical thinking, ethical use, and secure communication—significantly reduce perceived stress in Filipino Gen Z students. Yet, this protective effect only holds when device usage remains within moderate boundaries (ideally under 9 hours daily). In essence, literacy without limits, or usage without self-awareness, leads to digital vulnerability.

Thus, empowering Gen Z requires more than teaching shortcuts and platform hacks; it demands instilling a balanced approach to technology—where self-awareness, mindful usage, and institutional support reinforce each other. Whether through improved curricula, wellness policies, or personal routines, the pathway to digital resilience is paved by merging competence with care.

By mobilizing these insights, universities, policymakers, counselors, and students themselves can create healthier digital ecosystems—ones that harness technology’s strengths without sacrificing mental well-being or academic vitality. In a world increasingly dependent on screens, digital empowerment is not just a skill—it is a safeguard.

Thank you, panel—it has been a privilege to present these findings and contribute to the well-being and literacy of Gen Z in the Philippines.

If you need help with your research editing, proofreading, and data analysis, you can email us. Follow our Facebook for discount codes.
