Welcome to your thesis defense guide. This blog distills recent empirical data into a chapter-by-chapter defense guide, offering model Q&A responses rooted in meaningful theory and local context to help you confidently convey the real-world relevance and scholarly rigor of your study.
CHAPTER 1 THESIS DEFENSE QUESTIONS AND ANSWERS
1. Why did you choose this topic?
Committee focus:
Explain your personal, academic, or social interest in Filipino Gen Z, digital literacy, and mental health; highlight graduate-level importance; cite gaps in existing national data.
Sample Answer:
Growing up in Metro Manila amid remote schooling during the COVID-19 era, I witnessed my peers, mainly Gen Z students, showing strong digital fluency but also clear signs of stress tied to device overuse. A 2024 survey of 384 Filipino college students found high digital engagement, with a significant portion reporting "digital stress" and maladaptive coping patterns such as sleep disruption and multitasking (Giray, 2024). Meanwhile, technostress, the anxiety and overload caused by constant online connectivity, has been shown to reduce online learning quality across global studies (Saleem et al., 2024). Despite extensive mental health data in the Philippines, few studies examine how digital-skill levels (beyond mere access) correlate with stress-related symptoms in young adults. I chose this topic to investigate that specific intersection, digital literacy as a potential buffer against stress among Filipino Gen Z, since this focus could yield actionable insights for educators, counselors, and policy makers.
2. What is the main problem your thesis addresses?
Committee focus:
Define the specific research gap: digital literacy proficiency, not just access, and its influence on stress; emphasize the lack of local research on this link.
Sample Answer:
The central problem this study addresses is twofold. First, while "digital access" in the Philippines has dramatically improved post-2020, large gaps remain in digital literacy, especially critical evaluation and safe content creation, among Gen Z students (Barrot et al., 2021). Second, although technostress is well documented as a global concern for students (Saleem et al., 2024), there is little empirical evidence in the Philippine context on whether digital literacy proficiency mitigates perceived stress. Most existing local studies treat digital skills and mental health as separate domains. As a result, educators and policy makers lack evidence on whether improving students' critical digital competencies can reduce their stress levels. This study aims to fill that gap by linking digital literacy scores with self-reported stress and technostress dimensions among Filipino Gen Z university students.
3. What is the significance of your study?
Committee focus:
Show practical and academic contributions: support curriculum design, digital wellness policy, counseling practices, and Filipino Gen Z mental health evidence.
Sample Answer:
This study makes a novel contribution to both scholarship and practice by exploring the intersection between two vital yet siloed domains in Philippine education: digital literacy proficiency and student mental health. Internationally, technostress has been shown to degrade online learning and amplify anxiety (Saleem et al., 2024). Locally, research such as Barrot et al. (2021) indicates widespread challenges in technological readiness and psychological well-being among Filipino college students. By empirically testing whether digital literacy subdimensions, such as information evaluation and tool fluency, predict lower stress, the study yields evidence that can inform public education policy, university digital wellness initiatives, and counseling interventions. Further, it contributes to the academic literature as one of the first Filipino studies to link proficiency in digital competencies with stress mitigation rather than just digital access or usage rates.
4. What are your research questions or hypotheses?
Committee focus:
State clear RQs and testable hypotheses: the relationship between literacy and stress, plus the influence of device-use routines.
Sample Answer:
This quantitative study is structured around three research questions (RQs) and two testable hypotheses:
RQ1. What levels of digital literacy, across subscales for device fluency, critical information evaluation, and content creation, are exhibited by Filipino Gen Z university students?
RQ2. What are their perceived stress and technostress symptom levels?
RQ3. Is there a statistically significant correlation between digital literacy (overall and subscales) and perceived stress levels?
H1. Higher digital literacy, particularly in critical evaluation and safe use, will be associated with lower overall perceived stress.
H2. Specific digital routines (such as nighttime device use, multitasking, or frequent social media posting) will be positively correlated with technostress symptoms.
These RQs and hypotheses operationalize the study's aim to map literacy and stress levels and examine their interplay in the Philippine Gen Z context.
5. How did you come up with your research objectives?
Committee focus:
Demonstrate logical flow from gap to objectives; ensure each objective is measurable and aligns with RQs.
Sample Answer:
I derived three precise objectives directly from the conceptual gaps identified:
- To quantify digital literacy levels, specifically device fluency, critical evaluation, and content creation, using a validated Philippine-normed instrument.
- To assess perceived stress and technostress symptoms using established psychometric scales adapted for Filipino student samples.
- To analyze statistical relationships between digital literacy and stress outcomes, including moderating roles of device-use routines and socio-demographic factors such as household income, internet access quality, and academic year.
These objectives follow from the research questions and provide a concrete roadmap: first measure literacy, then measure stress, then test correlations controlling for relevant covariates. Achieving them enables evidence-based recommendations for educational interventions and mental health support tailored to Gen Z youth in the Philippines.
6. Who will benefit from your study/thesis, and how?
Committee focus:
Identify key stakeholders: Gen Z students, educators, counselors, university administrators, mental health policy makers; describe specific uses.
Sample Answer:
This research informs multiple stakeholders:
- Gen Z university students will gain self-insight on how digital habits and competence relate to stress, helping them adopt healthier digital routines.
- Educators and curriculum designers can use the results to reshape digital literacy instruction, emphasizing not only tool use but also critical online resilience and self-regulation.
- University counseling centers will receive data on which problematic digital behaviors (like screen overuse before bed or multitasking) correlate with higher stress, supporting targeted psychoeducational workshops.
- Higher education policymakers may leverage findings to justify the inclusion of digital wellness modules in DepEd and CHED frameworks.
- Parents and student support services benefit from culturally relevant insights that link digital skill deficits to mental health risks in young Filipinos.
By bridging digital proficiency with emotional well-being, the study provides actionable, culturally grounded guidance across education and mental health sectors.
7. What are the limitations of your study?
Committee focus:
Be candid about design (cross-sectional), self-report bias, sampling scope (Metro Manila), the rapidly changing digital environment, and unmeasured confounds.
Sample Answer:
The study has important limitations that warrant caution and suggest future directions:
- Cross-sectional design: Offers a snapshot of correlation, not causation; longitudinal tracking would be needed to examine directional effects.
- Self-report measures: Both literacy and stress scales are subject to recall bias or social desirability, despite including validity checks.
- Sampling scope: The study focuses on Gen Z students from three Metro Manila universities, limiting generalizability to rural or Visayas/Mindanao populations.
- Rapid digital evolution: By 2027 or 2028, platforms and usage norms may have shifted significantly, affecting replicability.
- Unmeasured variables: Factors like pre-existing mental health conditions, family support, or campus counseling access were not captured but may influence stress levels.
Despite these constraints, the study contributes foundational data and suggests avenues such as longitudinal, mixed-methods, and broader regional studies for future researchers.
8. How does your title reflect the content of your thesis?
Committee focus:
Highlight how each element of the title aligns with study elements: digital literacy, Gen Z, the Philippines, stress, technostress, and the quantitative design.
Sample Answer:
The title, "Digital Literacy and Stress Levels of Gen Z in the Philippines: A Quantitative Analysis of Technostress and Device-Use Patterns," was deliberately crafted to reflect exactly what, who, where, and how:
- "Digital Literacy" specifies evaluation of actual competency, not merely access or familiarity.
- "Stress Levels" highlights both general perceived stress and specific technostress symptom domains.
- "Gen Z in the Philippines" clearly defines the demographic cohort (approximately ages 18-24) within a unique national and cultural context.
- "Quantitative Analysis" signals that the study employs validated psychometric instruments and statistical hypothesis testing.
- "Technostress and Device-Use Patterns" succinctly conveys the independent variables under investigation, tying platform routines to mental-health outcomes.
Altogether, the title succinctly communicates the scope, population, and methodological rigor of the research.
CHAPTER 2 SAMPLE THESIS DEFENSE QUESTIONS AND ANSWERS
1. What is the theoretical or conceptual framework of your thesis?
Guide:
Explain the hybrid model grounding your study; specify who developed each theory, how they interlock, and why it fits your population and context.
Sample Answer
My conceptual framework combines Ng's (2012) tripartite model of digital literacy (technical operational fluency, cognitive evaluation/creation, and social/ethical communication) with technostress theory as described by Kumar (2024), which emphasizes emotional and physiological strain stemming from digital demands. In Ng's framework, digital literacy is not just about tool use but includes the ability to critically evaluate online information and communicate responsibly in digital spaces, skills that map directly to students' everyday behaviors (Ng, 2012). Kumar defines technostress as negative psychological strain resulting from pressures such as information overload, insecurity, and rapid tech changes (Kumar, 2024). This hybrid model situates digital literacy sub-skills as independent variables influencing perceived stress and technostress symptoms. Additionally, digital device-use routines (e.g., bedtime social media use, multitasking, platform switching) are conceptualized as mediators that either magnify or buffer the stress-literacy relationship. This framework is especially relevant to Filipino Gen Z university students because Giray et al. (2024) documented high rates of device engagement and corresponding digital stress in this age group. By integrating literacy and stress, the model allows for hypothesis testing of whether stronger literacy sub-scores predict lower technostress, and whether certain routines weaken or strengthen that effect. It therefore aligns precisely with both the research questions and the practical goal of informing digital wellness interventions for Gen Z in Philippine universities.
2. Why did you choose that framework over other models?
Guide:
Justify why Ng (2012) and Kumar (2024) provide the best conceptual alignment. Contrast with alternative models and clarify the fit for your population.
Sample Answer
I selected Ng's (2012) digital-literacy framework because it offers a validated, multidimensional approach that transcends simplistic notions of access or basic technical skill: it specifies critical evaluation and communication ethics, which are crucial for youth navigating misinformation and social norms online (Ng, 2012). Competing models like Ferrari (2012) or UNESCO's ICCS offer broader skill categories but often lack the parity between technical fluency and ethical communication needed for stress-related outcomes. Meanwhile, technostress theory, as synthesized by Kumar (2024), is among the most comprehensive and up-to-date in capturing the adverse emotional impacts of modern digital life. Unlike general stress models (e.g., Lazarus & Folkman, 1984), it specifically addresses stressors like platform overload and technological insecurity that students face daily. Moreover, Kumar's framework differentiates between "hindrance," "challenge," and "eustress" stressors, allowing our study to empirically test not just stress but also adaptive resilience in routines. Filipino college student studies (e.g., Giray et al., 2024) identified high device dependence and psychological strain, symptoms that align directly with the technostress construct. Therefore, pairing Ng's model with Kumar's technostress theory provides the most conceptually coherent and empirically grounded basis for examining how literacy mitigates stress, especially in the Gen Z Philippine context.
3. What are the key constructs or variables derived from the literature?
Guide:
List independent, dependent, and mediator/moderator variables with definitions tied to cited models.
Sample Answer
Drawing on Ng (2012) and Kumar (2024), the key constructs are:
- Independent Variables:
- Technical Literacy: ease of device manipulation, tool navigation, and digital operation skills.
- Cognitive Literacy: ability to critically evaluate online information, create digital content, and integrate knowledge.
- Social Literacy: responsible communication, awareness of cyber-etiquette, and security/privacy practices (Ng, 2012).
- Dependent Variables:
- Perceived Stress: general psychological stress related to digital overload.
- Technostress Symptoms: emotional or physiological responses such as anxiety, fatigue, and irritability triggered by technology use (Kumar, 2024).
- Mediator/Moderator Variables:
- Device-Use Routines: behaviors like device use before bedtime, frequent platform switching, and multitasking. These can act as stress amplifiers (hindrance stressors) or resilience builders (challenge stressors) depending on context (Kumar, 2024).
- Control Variables:
- Demographics such as age, gender, income, internet access quality, and academic discipline, drawn from Giray et al.'s (2024) survey, which found these moderate digital stress levels.
This constellation allows for quantitative testing of the hypothesis that higher levels of digital literacy, especially the cognitive and social dimensions, will predict lower perceived stress and technostress, while specific routines will either buffer or exacerbate the relationship.
4. How did you operationalize your constructs based on the literature review?
Guide:
Describe survey instruments, their origins, subscale items, reliability/validity, and how they map onto your variables.
Sample Answer
Each construct was operationalized using validated, literature-backed surveys. Digital literacy was measured via a Philippine-adapted instrument based on Ng's three-domain model: technical literacy (e.g., "I can operate a camera, change settings, manage files"), cognitive literacy (e.g., "I check multiple sources before believing online content"), and social literacy (e.g., "I follow privacy guidelines when posting"), each with 5 items rated on a 5-point Likert scale. The scale was pre-tested for internal consistency (Cronbach's α > .80 per subscale), aligning with previous implementations in Southeast Asia. Perceived stress was assessed using the 10-item Perceived Stress Scale (PSS-10) adapted for the digital context; technostress was measured using Kumar's typology with "hindrance" and "challenge" stressor subscales (e.g., "I feel overwhelmed by too many app notifications"). Device-use routines were recorded via behavior logs: students reported average daily hours, frequency of night-time phone checks, and rate of multitasking across platforms. Demographic and internet-access data were included as control variables. These operational definitions tie directly back to the conceptual framework: literacy scores drive hypothesis testing, technostress items capture the outcome, and routines test mediation. Piloting confirmed convergent validity with stress outcomes (r = -.45 between technical literacy and technostress), further validating the mapping.
5. What are the major findings from prior studies that informed your literature review?
Guide:
Summarize 3-4 key empirical findings from both Philippine and global literature, and explain how they influence your study's direction.
Sample Answer
Several prior studies shaped the literature groundwork:
- Giray et al. (2024) surveyed 384 Filipino college students and found that heavy digital device engagement, especially with social media and video platforms, correlated positively with digital stress, fatigue, sleep disruption, and reduced academic focus. This provided foundational evidence of stress among Filipino Gen Z.
- Kumar's (2024) review on technostress noted that high levels of digital overload, security concerns, and incessant platform switching were robust predictors of emotional exhaustion, burnout, and anxiety across global university samples.
- Globally, Mohammadyari & Singh (2015) found that individuals with stronger cognitive digital literacy skills reported better online learning performance and lower frustration under remote learning, suggesting protective benefits of literacy.
- A Bangladeshi study by Hossain et al. (2021) indicated that students with limited digital competence experienced heightened fear of academic delay and greater stress in fully online learning environments.
These findings collectively indicate that high device engagement often leads to stress, but strong digital literacy, especially cognitive resilience, can mitigate it. They confirmed the need to study both sides simultaneously: literacy and stress. The Philippine context's manifest device-engagement-stress link and the protective global evidence of literacy competency shaped both the research questions and the conceptual model used for this study.
6. Were there conflicting findings in the literature, and how did you reconcile them?
Guide:
Highlight contradictory or mixed results (e.g., literacy is not always protective; stress sometimes decreased in high-literacy but high-use groups), and explain how you accounted for this in your model, for example by modeling interactions or exploring nonlinear effects.
Sample Answer
Yes, the literature revealed some nuanced or conflicting findings that shaped my framework. For instance, while several studies show that stronger digital literacy correlates with lower technostress in online learning settings (Mohammadyari & Singh, 2015), other research, particularly among highly connected youth, suggests that even individuals with strong technical skills report stress when platform-use routines are excessive (Giray et al., 2024). In some global samples, challenge-based technostress (such as using new technology to solve tasks) correlates positively with motivation and engagement, complicating the assumption that all technostress is harmful (Kumar, 2024). These contradictions led me to distinguish between hindrance stressors (e.g., information overload, insecurity) and challenge stressors (e.g., complexity that leads to skill-building), based on the refined technostress model. I therefore included interaction terms between digital literacy subscales and device-use routines in the analysis to test whether literacy mitigates stress only when routines reflect challenge rather than overwhelm. Additionally, I included quadratic terms in the regression models to detect nonlinear effects, i.e., literacy may protect only until routines exceed a threshold. This reconciled the conflicting findings by allowing the model to differentiate when literacy is protective and when even literate users still feel stressed due to excessive use. It also lent nuance to the policy recommendations, suggesting that literacy training alone is not enough without healthy device-practice guidelines.
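To make the modeling idea concrete, here is a minimal sketch, in Python with statsmodels, of a regression that includes both an interaction term and a quadratic usage term. The data and variable names (literacy, screen_hours, stress) are simulated placeholders, not the study's dataset; the thesis itself ran these models in SPSS.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data (the real study used survey responses analyzed in SPSS).
rng = np.random.default_rng(42)
n = 400
df = pd.DataFrame({
    "literacy": rng.normal(4.0, 0.7, n),       # digital literacy score (1-5 scale)
    "screen_hours": rng.normal(7.0, 2.5, n),   # average daily device use
})
df["stress"] = (30 - 2.5 * df["literacy"] + 0.8 * df["screen_hours"]
                + 0.4 * df["literacy"] * df["screen_hours"]
                + rng.normal(0, 3, n))         # toy outcome that contains an interaction

# Mean-center predictors so the main effects stay interpretable.
df["lit_c"] = df["literacy"] - df["literacy"].mean()
df["hrs_c"] = df["screen_hours"] - df["screen_hours"].mean()

# The interaction term tests moderation; the quadratic term probes a nonlinear,
# threshold-like effect of usage on stress.
model = smf.ols("stress ~ lit_c * hrs_c + I(hrs_c ** 2)", data=df).fit()
print(model.summary())
```

A significant lit_c:hrs_c coefficient in such a model would signal that usage routines change how protective literacy is, while a significant quadratic term would point to the threshold pattern discussed above.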
7. How did your literature synthesis lead to your hypotheses and research design?
Guide:
Explain how thematic synthesis of findings allowed you to specify research questions, relationships, and method. Show the flow: literature gap → conceptual model → hypotheses → survey design.
Sample Answer:
The literature review revealed two core gaps: (a) Filipino Gen Z exhibits high levels of device engagement and stress, with little study linking literacy proficiency to these outcomes (Giray et al., 2024), and (b) technostress theory offers nuance on different stressor types yet lacks application in Philippine youth samples (Kumar, 2024). Synthesizing these, I adopted Ng's (2012) digital literacy framework to organize sub-skills and Kumar's stress model to conceptualize outcome pathways. From this synthesis, the research questions were naturally derived: (1) What are literacy levels among Filipino Gen Z? (2) What are stress and technostress symptom levels? (3) How are they associated, and does literacy dampen stress? (4) Do device routines moderate that association? Correspondingly, two testable hypotheses were framed: H1, higher literacy predicts lower technostress; H2, higher-risk routines (e.g., nighttime use, multitasking) amplify stress unless buffered by high literacy. The survey instrument was then constructed using validated digital literacy and technostress scales, and routine items were informed by behavioral findings from previous studies (Giray et al., 2024; Kumar, 2024). This unified approach, literature synthesis → conceptual model → hypotheses → operational definitions, ensured both theoretical coherence and empirical rigor in the research design.
8. How does your conceptual framework guide your analysis and interpretation of results?
Guide:
Describe step-by-step how statistical modeling follows the framework: correlation, regression, mediation, moderation, subgroup analysis.
Sample Answer
The conceptual framework structured the analytical plan in four phases:
- Descriptive Analysis & Correlation Matrix: to profile Gen Z literacy scores, stress levels, and digital routines, and to examine bivariate associations as predicted by theory. For instance, we expected negative correlations between cognitive literacy and technostress.
- Multiple Regression & Hierarchical Models: to test the hypothesis that literacy subscale scores significantly predict lower perceived stress after controlling for demographics and internet access (a minimal sketch follows this answer). The technostress subscales (hindrance and challenge stressors) were also included as dependent variables to test specificity.
- Interaction and Moderation Tests: guided by the technostress distinction, interaction terms between literacy scores and routine variables were added. This allowed testing whether the protective effect of literacy weakened under high-risk routines (e.g., a daily social media binge in the pre-sleep hour) or stayed robust in challenge-like routines.
- Subgroup Analysis by Discipline and Access Quality: since Giray et al. (2024) found variance in stress by income and connectivity, the model compared effects across those groups to interpret whether the framework held under differing access conditions.
The framework ensures that interpretation of results is aligned with theoretical pathways, demonstrating whether findings support the literacy buffer, reveal thresholds where routines dominate, or show differential patterns across access strata. This approach avoids purely descriptive interpretation and grounds conclusions in the original hybrid model.
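As a companion to the hierarchical-models phase above, the following sketch shows, on simulated data, how a two-block hierarchical regression and its R² increment could be computed in Python. The column names and effect sizes are illustrative assumptions; the actual analysis was run in SPSS.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy data standing in for the survey responses.
rng = np.random.default_rng(7)
n = 400
df = pd.DataFrame({
    "income": rng.integers(1, 6, n),      # income bracket (control)
    "access": rng.integers(1, 4, n),      # internet-access tier (control)
    "literacy": rng.normal(4.0, 0.7, n),  # digital literacy score
})
df["stress"] = 28 - 2.0 * df["literacy"] - 0.5 * df["access"] + rng.normal(0, 4, n)

# Block 1: controls only.
X1 = sm.add_constant(df[["income", "access"]])
m1 = sm.OLS(df["stress"], X1).fit()

# Block 2: controls plus digital literacy.
X2 = sm.add_constant(df[["income", "access", "literacy"]])
m2 = sm.OLS(df["stress"], X2).fit()

# The increment in R-squared attributable to literacy is the quantity of interest.
print(f"R2 controls only : {m1.rsquared:.3f}")
print(f"R2 with literacy : {m2.rsquared:.3f}")
print(f"Delta R2         : {m2.rsquared - m1.rsquared:.3f}")
```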
CHAPTER 3 SAMPLE THESIS DEFENSE QUESTIONS AND ANSWERS
1. What research design did you adopt and why?
Guide:
Explain your choice of cross-sectional survey design, why a quantitative method fits your RQs, the dataset scale, and how it aligns with past studies in Philippine contexts.
Sample Answer:
To address our research questions, we adopted a cross-sectional quantitative survey design. Cross-sectional surveys are ideal for mapping digital literacy and stress across a broad, demographically stratified Gen Z sample at one point in time, permitting efficient correlation and multivariable modeling without the complexity of longitudinal follow-up (Lim et al., 2022). This design aligns with mental-health research among Filipino university students during and after the pandemic (Lim et al., 2022). It also mirrors technostress studies in higher-education settings where cross-sectional formats yielded valid moderation findings (Vega-Muñoz et al., 2022). Quantitative data allow hypothesis testing concerning relationships, such as whether cognitive digital literacy predicts lower technostress, while remaining manageable and ethically feasible in an online environment. A mixed-methods design was considered but ultimately deprioritized due to time and logistical constraints. Using an online Google Form (see Q6) permitted scale deployment across multiple institutions, protecting anonymity and reducing contact, which was critical during PHE/remote learning settings (Lim et al., 2022; Vega-Muñoz et al., 2022). Overall, this design delivers rigor, relevance, and reproducibility within the Philippine Gen Z cohort.
2. How did you select your participants or respondents?
Guide:
Describe your target population, inclusion/exclusion criteria, and rationale. Highlight why selecting Gen Z Filipino university entrants matters.
Sample Answer:
The target population comprised Gen Z students (ages 18-24) enrolled in undergraduate programs at three universities in Metro Manila, representing private and public institutions and both technical and liberal arts disciplines. We used stratified convenience sampling to ensure representation across gender, academic year, and digital-access types (e.g., fiber vs. mobile data), informed by prior work showing digital stress variance by access quality (Lim et al., 2022). Inclusion criteria were current enrollment in the first or second year, regular use of smartphones or laptops, and self-identification as Filipino. Exclusion criteria included medical leave during data collection or a diagnosed mental health condition severe enough to compromise informed consent. This approach balances feasibility, given our access to university mailing lists and online classrooms, with adequate diversity, while remaining ethically transparent. Although probability sampling would offer greater generalizability, stratified convenience sampling allowed us to reach the N ≈ 400 target expediently and safely under current health protocols. Prior Filipino student stress studies found sample sizes of 200-400 sufficient for stable estimates (Reyes & Resuento, 2023). Ultimately, this sampling strategy supports internal validity and subgroup comparisons by discipline or access type.
3. What sampling method and sample size did you use and why?
Guide:
Explain the sample size calculation, expected effect sizes, and power, and account for sampling error. Show the committee you anticipated power issues and sampling error.
Sample Answer:
We targeted a minimum sample size of 384 based on Cochran's formula for a 5% margin of error and a 95% confidence level, adjusted for a finite population of Gen Z students across three universities (estimated combined N ≈ 20,000). Adjusting for an anticipated 80% response rate led us to send questionnaires to roughly 500 students. Ultimately, we obtained N = 412 usable responses. This meets the convention of at least 10 respondents per predictor for regression models and satisfies the sample-to-item ratio for scale validation (Taber, 2018), especially given our roughly 30 items across the literacy and stress scales (Taber, 2018; Tavakol, 2011). The sample size also exceeds thresholds used in similar Philippine studies (e.g., N ≈ 384 for device-engagement and stress surveys) (Giray et al., 2024). We employed stratified convenience sampling to ensure representation across gender, internet-access type, and academic discipline. Although probability sampling would enhance external validity, the size and diversity of our stratified sample allow for internally consistent and statistically significant findings across multiple literacy and stress subscales.
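For transparency, the sample-size arithmetic can be reproduced with a few lines of Python. This is a minimal sketch under the stated assumptions (p = 0.5 for maximum variability, 95% confidence, 5% margin of error, N ≈ 20,000, 80% expected response rate); the exact invitation count depends on rounding choices.

```python
import math

# Cochran's base formula: n0 = z^2 * p * (1 - p) / e^2
z = 1.96   # z-score for 95% confidence
p = 0.5    # assumed proportion (maximum variability)
e = 0.05   # 5% margin of error
n0 = (z ** 2) * p * (1 - p) / e ** 2      # about 384.2, hence the target of 384

# Optional finite-population correction for N ≈ 20,000 students
N = 20_000
n_fpc = n0 / (1 + (n0 - 1) / N)           # about 377, slightly below n0

# Inflate the target by the anticipated 80% response rate
invites = math.ceil(n0 / 0.80)            # about 481, rounded up to ~500 in practice

print(f"Cochran n0       : {n0:.1f}")
print(f"With FPC         : {n_fpc:.1f}")
print(f"Invitations sent : about {invites}")
```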
4. What instruments or tools did you use for data collection?
Guide:
Describe each instrument: how it was adapted, number of items, domains, source, and relevance.
Sample Answer:
Three main instruments were used:
- The Digital Literacy Survey, adapted from the Digital Literacy Scale (DLS) developed by Avinç & Doğan (2024), covering technical, cognitive (critical evaluation/content creation), and social/ethical dimensions with 20 Likert-type items. It offers robust Rasch-based validation and cross-cultural adaptability, making it suitable for Filipino Gen Z youth.
- The Perceived Stress Scale-10 (PSS-10), validated among Filipino university students by Reyes & Resuento (2023) with a Cronbach's alpha of 0.81, which attests to its reliability and cultural relevance.
- The Technostress Questionnaire, adapted from the Technostress Creators/Inhibitors framework and translated for student populations in Chile and Latin America (Vega-Muñoz et al., 2022), with 19 items focused on techno-overload, complexity, insecurity, and related dimensions. It has shown strong internal consistency in prior university samples (Vega-Muñoz et al., 2022).
We also included a digital-behaviour log (self-report of nightly screen use, number of platforms used, etc.) and a demographics/internet-access section. The entire survey was delivered via Google Forms for convenience and confidentiality.
5. How did you ensure validity and reliability?
Guide:
Discuss content validity, construct validity, internal consistency metrics (Cronbach's alpha), pilot testing, and how you accounted for threats to validity.
Sample Answer:
Content validity was first ensured by consulting three local experts in digital education and psychology, who reviewed items for cultural appropriateness and clarity, refining wording based on iterative feedback. For construct validity, we performed exploratory (EFA) and confirmatory factor analysis (CFA). The DLS's tri-factor structure and the technostress questionnaire's three dimensions were supported (CFI > 0.95, RMSEA < 0.06), consistent with prior validation studies (Avinç & Doğan, 2024; Vega-Muñoz et al., 2022). For reliability, we calculated Cronbach's alpha per subscale; all values ranged between 0.78 and 0.89, comfortably within the "good" range of 0.70-0.90 (Taber, 2018; Tavakol, 2011). The PSS-10 Cronbach's alpha was 0.81 among our respondents, aligning with Filipino validation data (Reyes & Resuento, 2023). Pilot testing on 30 students resulted in minor wording adjustments and confirmed that the average completion time was under 15 minutes, which helped minimize respondent fatigue and dropout. Together, these steps ensured that the instruments were both valid and reliable within our sample and culturally resonant.
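To illustrate the reliability step, here is a minimal sketch of Cronbach's alpha computed in plain Python/NumPy on a simulated 5-item Likert matrix. The real coefficients reported above came from the study's SPSS output, so this is only a didactic stand-in.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 5 Likert items answered by 400 respondents, loosely correlated
# through a shared latent trait.
rng = np.random.default_rng(0)
trait = rng.normal(0, 1, (400, 1))
items = np.clip(np.round(3 + trait + rng.normal(0, 0.8, (400, 5))), 1, 5)

print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")  # typically in the .7-.9 range here
```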
6. What data collection procedures did you follow?
Guide:
Outline the steps: instrument translation/adaptation, pilot testing, online platform setup, informed consent process, data security, and timing.
Sample Answer:
Data collection occurred between March and April 2025 via an online Google Form. First, the original English DLS and technostress scales were reviewed by bilingual experts for clarity and relevance. Pilot testing was performed with 30 Gen Z participants who completed the survey on smartphones or laptops; feedback led to minor edits and confirmed the estimated 10-15 minute completion time. The finalized survey began with a mandatory informed-consent page, requiring participants to click "Agree" before advancing (Qualtrics, 2020). The consent page outlined voluntary participation, anonymity, data use, and a contact for queries. After consent, participants answered demographics, followed by the DLS, the technostress scale, and the PSS-10. Participants were instructed to close and restart if their devices froze, and a unique code ensured no duplicate entries. No personally identifying information (PII) was collected. Data were stored in a password-protected Google Drive folder accessible only to the principal investigators. Reminders were sent through university mailing lists and forums at days 7 and 14 post-launch, which helped sustain participation. Institutional Research Ethics Office (IREO) clearance was obtained prior to deployment, and all procedures complied with local ethical guidelines for online youth research (Lim et al., 2022; Qualtrics, 2020).
7. What data analysis methods did you use?
Guide:
Explain the statistical procedures: descriptive analysis, reliability testing, correlation, regression, moderation analyses (interaction terms), control variables, and software used.
Sample Answer:
We conducted four major analysis phases using SPSS v26 and the PROCESS macro v4.1 for moderation testing.
- Descriptive statistics: Means, SDs, and frequency distributions provided the demographic and literacy/stress profile.
- Reliability tests: Cronbach's alpha and item-total correlations for each scale subcomponent ensured internal consistency (α = 0.78-0.89).
- Correlation and multiple regression: Pearson correlations evaluated bivariate relationships (e.g., cognitive literacy vs. technostress). Regression models tested whether each literacy subscale significantly predicted perceived stress and technostress symptoms while controlling for income, internet access type, and academic discipline.
- Moderation analysis: Based on Kumar's technostress model, we created interaction terms between device-use routines (e.g., nightly screen time, multitasking frequency) and literacy scores to see whether routine behaviour moderated the literacy-stress relationship. PROCESS Model 1 tested whether high literacy buffered stress under challenging routines (a minimal sketch follows this answer).
Adequacy of assumptions (e.g., homoscedasticity, VIF < 3) was checked before interpreting results. Effect sizes followed Cohen's conventions. All tests used α = 0.05. The framework-driven analytic plan provided evidentiary consistency with prior technostress and digital literacy studies (Vega-Muñoz et al., 2022; Taber, 2018).
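PROCESS Model 1 is an SPSS/SAS macro, but its core computation, an OLS model with a mean-centered interaction term plus a percentile bootstrap of that coefficient, can be approximated as in the sketch below. The data and variable names (literacy, night_use, stress) are simulated and illustrative, not the study's output.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "literacy": rng.normal(4.0, 0.7, n),     # predictor (X)
    "night_use": rng.normal(2.0, 1.0, n),    # moderator (W): nightly screen hours
})
df["stress"] = (26 - 2.0 * df["literacy"] + 1.2 * df["night_use"]
                + 0.6 * df["literacy"] * df["night_use"] + rng.normal(0, 3, n))

# Mean-center X and W before forming the interaction, as PROCESS does on request.
df["x_c"] = df["literacy"] - df["literacy"].mean()
df["w_c"] = df["night_use"] - df["night_use"].mean()

formula = "stress ~ x_c * w_c"
fit = smf.ols(formula, data=df).fit()
print(fit.params)  # 'x_c:w_c' is the moderation (interaction) coefficient

# Percentile bootstrap (5,000 resamples) of the interaction coefficient.
boot = np.empty(5000)
for i in range(5000):
    idx = rng.integers(0, n, size=n)  # resample rows with replacement
    resample = df.iloc[idx].reset_index(drop=True)
    boot[i] = smf.ols(formula, data=resample).fit().params["x_c:w_c"]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Bootstrapped 95% CI for the interaction: [{lo:.3f}, {hi:.3f}]")
```

A confidence interval that excludes zero mirrors the PROCESS criterion for a significant moderation effect.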
8. What ethical considerations did you apply in your thesis?
Guide:
Show formal ethics clearance, consent, data protection, anonymity, benefits vs. risk, and special considerations for Gen Z youth.
Sample Answer:
This research received approval from the university's Institutional Review Board (IRB) prior to data collection. Ethical practices upheld informed consent: participants were given a clear overview of the study's purpose, voluntary nature, and data usage before they could proceed (Qualtrics, 2020). No PII (names, emails) was collected, ensuring anonymity. Responses were stored on encrypted university servers accessible only to the investigators. Participants were assured that withdrawal could occur at any time without penalty.
Given the mental-health focus, we provided a list of university counseling services at the end of the survey for participants who felt distressed. All procedures adhered to the Philippine National Ethical Guidelines for non-clinical social-science research involving adults (Lim et al., 2022). We minimized risk by designing the survey to avoid distressing wording and by allowing participants to exit at will. Only cumulative scale scores (not raw item-level data) were shared in publications, and all results are reported in aggregated form to prevent identification of individuals. There were no incentives, reducing pressure and preserving voluntariness. Overall, the protocols ensured respect, beneficence, and privacy aligned with best practices for youth online research.
CHAPTER 4 SAMPLE THESIS QUESTIONS AND ANSWERS
1. What are the key findings in Chapter 4?
Sample Answer
In Chapter 4, descriptive statistics revealed an average digital literacy score of 4.11 (SD = 0.74) and a mean perceived stress score of 25.3 (SD = 5.8), indicating moderate stress levels. Pearson's correlation showed a strong, statistically significant inverse association between digital literacy and stress (r = -.43, p < .01, 95% CI = [-.52, -.32]) (Giray et al., 2025). A multiple regression controlling for gender, age, and internet-access tier explained 18% of the variance in stress (R² = .18, F(4, 395) = 21.7, p < .001), with digital literacy emerging as a significant negative predictor (β = -.38, t = -7.2, p < .001). Subgroup analysis further showed stronger effects in the low-income group (β = -.45, p < .001) compared to those with stable broadband access (β = -.31, p = .01), suggesting socioeconomic buffering. Although Filipino Gen Z displayed robust digital competency, those with very high usage (over ~9 hours/day) began to lose stress protection, supporting a consumption-threshold model. This cluster pattern reinforces that higher digital literacy correlates with lower stress, yet situational and access factors moderate that relationship. These quantitative outcomes will anchor the discussion in Chapter 5, emphasizing that literacy alone is insufficient without healthy usage practices.
2. Which result surprised you most, and why?
Sample Answer:
The most surprising finding was the inverted relationship detected between literacy and stress at high usage thresholds. Although we anticipated that higher digital literacy would always protect against stress, moderation analysis exposed a significant positive interaction when daily device usage exceeded approximately 10 hours (β = +.12, p = .04). In practical terms, at extreme usage levels digital competency no longer conferred stress relief; in fact, it appeared to slightly increase stress. This reflects a ceiling effect: digital skill cannot compensate for over-exposure. The pattern mirrors the double-edged impact of Internet use noted among Filipino university students (Abad Santos et al., 2023).
Another unexpected twist was the weak buffering role of peer and family support. Existing literature and hypotheses (e.g., Saleem et al., 2024) emphasize the importance of informal social support in reducing technostress, yet our study found that informal support networks were not statistically significant moderators (p > .10), whereas formal institutional support remained effective (p < .01).
Together, these findings prompted a re-examination of theoretical assumptions: high literacy and social skill are not sufficient unless accompanied by structured institutional interventions and controlled use patterns. These unexpected patterns shaped the discussion and recommendations chapters.
3. How do your findings relate to your thesis research questions?
Sample Answer:
Research Question 1 asked whether there is a relationship between digital literacy and stress among Gen Z respondents. The data clearly answered yes: higher digital literacy significantly predicts lower stress (r = -.43, β = -.38, p < .001) (Giray et al., 2025). Research Question 2 examined whether institutional support (e.g., school or university frameworks) moderates this relationship. Analysis using Hayes' PROCESS (Model 1) revealed a significant moderating effect (interaction term β = +.11, p = .009), indicating that formal supports strengthen the protective literacy-stress link.
Research Question 3 addressed differences across socioeconomic and usage tiers. Stratified regression confirmed that the literacy-stress slope was steeper in low-income groups (β = -.45) relative to participants with steady broadband access (β = -.31), validating the role of economic barriers in digital resilience.
Research Question 4 explored whether peer or familial support averts stress; findings show no significant moderation effects for the informal network variables (all ps > .10), even though these variables were significantly correlated with stress in simple correlations.
Finally, Research Question 5 tackled whether usage thresholds matter: Johnson-Neyman analysis indicated the literacy-stress link ceases to be significant above the 84th percentile of usage (~10 hr/day), offering substantive validation of the usage-threshold model. Therefore, each research question is systematically addressed, with findings generally supporting the hypotheses except for the informal support components.
4. What patterns or themes emerged from your thesis data?
Sample Answer
A clear clustering pattern emerged when digital literacy and usage data were grouped via k-means analysis. Cluster A (n = 85) comprised high-literacy respondents (M = 4.5) with moderate usage (M = 6 hours/day), exhibiting the lowest stress levels (M = 18) and consistently higher institutional support scores. In contrast, Cluster B (n = 76) included individuals with moderate literacy (M = 3.8) but excessive usage (>10 hours/day), and they reported the highest stress scores (M = 31), despite having marginally better peer/family support. This typology affirms the technostress overload model, whereby high device exposure undercuts literacy's protective value (Tarafdar et al., 2007).
Additionally, gender differences within Cluster B revealed that female participants more frequently reported sleep interruptions (75% vs. 60% of males) and disruptions in school engagement. Conversely, negligible gender differences were observed in Cluster A. Another emergent pattern: time of day did not significantly alter these clusters, nor did the main effects change when analyses focused on smartphone-only users (vs. multi-device users), indicating that sheer duration rather than device type drives stress risk. These patterns informed Chapter 5's theme of threshold-based digital resilience and nuanced support strategies for Gen Z.
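For readers curious how such a typology is produced, below is a minimal k-means sketch in Python (scikit-learn) on simulated literacy and usage values roughly matching the cluster sizes above; it is illustrative only, not the study's actual clustering output.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Simulated respondents: digital literacy (1-5) and daily usage hours.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "literacy": np.concatenate([rng.normal(4.5, 0.3, 85), rng.normal(3.8, 0.4, 76)]),
    "hours":    np.concatenate([rng.normal(6.0, 1.0, 85), rng.normal(10.5, 1.2, 76)]),
})

# Standardize so neither variable dominates the Euclidean distance.
X = StandardScaler().fit_transform(df[["literacy", "hours"]])

# k = 2 typology: e.g., "high literacy / moderate use" vs. "moderate literacy / heavy use".
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
df["cluster"] = km.labels_

print(df.groupby("cluster")[["literacy", "hours"]].mean().round(2))
print(df["cluster"].value_counts())
```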
5. Were any findings contradictory to existing literature?
Sample Answer:
Yes, certain results diverged meaningfully from established literature. For instance, previous research often portrays Generation Z as largely immune to stress from digital use, given their status as "digital natives" (Barrot, 2018). Our findings, however, challenge that notion: we found that beyond a threshold of ~9-10 hours daily, digital literacy no longer protects against stress and may even exacerbate it. This nuance undermines the blanket assumption of effortless digital resilience.
Likewise, the double-edged nature of Internet use demonstrated in Abad Santos et al. (2023) indicates that while online social support mediates positive mental health effects, excessive use still yields direct negative outcomes. In our data, although institutional social support buffered stress effectively, peer/family support did not significantly mediate as prior studies suggest, revealing a notable departure in the Philippine Gen Z context (Saleem et al., 2024).
Further inconsistency emerged in the subgroup gender analysis: some previous studies emphasize stronger peer buffering among females, but our analysis found no significant gender interaction in the support variables, suggesting socio-cultural differences in help-seeking behavior. These deviations from prior expectations prompted a deeper literature review and are explored in greater depth in Chapter 5.
6. How did you conduct the data analysis in your thesis?
Sample Answer:
The analytic workflow was structured as follows:
- Software: SPSS 27 and Hayes' PROCESS macro v4.1 for moderation and mediation analysis.
- Reliability & validity: Cronbach's α for digital literacy (.89) and stress (.87); composite reliability > .85; exploratory factor analysis confirmed item loadings > .60; Harman's single-factor test showed one factor explained 34% of variance, below the 50% benchmark, which minimizes concerns about common-method bias.
- Descriptive and correlational: Means, standard deviations, Pearson correlations (two-tailed), and normality checks (Kolmogorov-Smirnov p > .05).
- Regression & moderation: Independent variables were mean-centered. PROCESS Model 1 tested interaction terms (digital literacy × institutional support) with bootstrapped 95% CIs (5,000 samples). Johnson-Neyman output identified regions of significance based on the usage distribution (a simple-slope sketch follows this list).
- Cluster analysis: Used k-means clustering (k = 2) to identify usage-literacy typologies; Levene's and post-hoc t-tests confirmed cluster distinctions.
- Sensitivity checks: Stratified regression by gender and access tier; additional categorical variables were tested via interactions with the control variables. The methods mirror technostress and moderated-regression standards in education research (Saleem et al., 2024).
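The Johnson-Neyman step can also be hand-rolled outside PROCESS by scanning the simple slope of literacy across the usage distribution and checking where it stops being significant. The sketch below does this on simulated data; the variable names and the resulting region of significance are illustrative assumptions, not the study's figures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# Simulated stand-in: stress (y) predicted by literacy (x), moderated by daily usage (w).
rng = np.random.default_rng(5)
n = 400
df = pd.DataFrame({"x": rng.normal(4.0, 0.7, n), "w": rng.normal(7.0, 2.5, n)})
df["y"] = 30 - 3.0 * df["x"] + 0.3 * df["x"] * df["w"] + rng.normal(0, 3, n)

fit = smf.ols("y ~ x * w", data=df).fit()
b = fit.params          # Intercept, x, w, x:w
V = fit.cov_params()    # coefficient covariance matrix

# Simple slope of literacy at a given usage level w0: b_x + b_xw * w0,
# with Var = Var(b_x) + w0^2 Var(b_xw) + 2 w0 Cov(b_x, b_xw).
t_crit = stats.t.ppf(0.975, df=fit.df_resid)
for w0 in np.linspace(df["w"].min(), df["w"].max(), 9):
    slope = b["x"] + b["x:w"] * w0
    se = np.sqrt(V.loc["x", "x"] + (w0 ** 2) * V.loc["x:w", "x:w"]
                 + 2 * w0 * V.loc["x", "x:w"])
    sig = "significant" if abs(slope / se) > t_crit else "n.s."
    print(f"usage = {w0:5.1f} h/day -> slope = {slope:6.2f} (SE {se:.2f}) {sig}")
```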
7. What evidence supports the validity and reliability of your measures?
Sample Answer:
Chapter 4 presents rigorous validation of the instruments:
- Internal consistency: Both the digital literacy and perceived stress scales achieved Cronbach's α > .87 and composite reliability > .85.
- Construct validity: Exploratory factor analysis confirmed that each item loaded above .60 on its respective factor, with no cross-loadings. Harman's single-factor test showed that the first factor captured only 34% of variance, below the threshold that suggests common-method bias.
- Criterion validity: Significant correlations were found between digital literacy and stress, as well as the support variables, consistent with theoretical expectations.
- Moderation validation: The PROCESS bootstrapping method produced interaction terms whose confidence intervals remained entirely above or below zero, indicating robustness. The adjusted R² change (approximately .015) was statistically significant.
- Bootstrapped Johnson-Neyman regions confirmed differential effects above and below the usage thresholds.
- Cluster validation: Follow-up chi-square tests for demographic distribution (e.g., gender and income tier) across clusters revealed no significant bias (p > .10), supporting the internal homogeneity of the clusters.
These techniques align with best practice in quantitative social science research and mirror frameworks used in technostress studies (Saleem et al., 2024), thereby ensuring confidence in the results' reliability and validity.
8. What practical implications do your results suggest?
Sample Answer:
The findings suggest three practical interventions:
- Revised digital literacy training: Curricula should go beyond technical proficiency to include digital well-being, emphasizing self-regulation and awareness of usage thresholds, especially since the present findings show that literacy alone does not prevent stress beyond roughly 9 hours of daily use.
- Institutional support frameworks: Educational institutions (senior high schools, colleges, and universities) should develop formal support mechanisms, e.g., onboarding orientations on healthy device habits, digital wellness workshops, and real-time technical support desks, because institutional support significantly moderated stress in our findings (Saleem et al., 2024).
- Active peer mentoring and structured community groups: While online peer or family support showed limited buffering in our study, frameworks that channel this support into formal, guided peer-mentorship channels or teacher-led online communities may better harness the positive effects of online social support, consistent with the pathways identified by Abad Santos et al. (2023).
These implications recommend that policy makers, school administrators, and mental health practitioners implement usage-aware design, combine formal support structures, and encourage structured peer mentoring to build resilience in Filipino Gen Z digital users.
CHAPTER 5 SAMPLE THESIS QUESTIONS AND ANSWERS
1. What is the overall summary of your thesis and key findings?
Sample Answer:
This study investigated the interplay between digital literacy and perceived stress among Filipino Gen Z university students. Descriptive results revealed elevated digital literacy (M = 4.1/5, SD = 0.7) yet moderate stress levels (PSS ≈ 25/40), suggesting a dual reality of capability and strain. Pearson's correlation indicated a significant inverse relationship between digital literacy and technostress (r = -.43, p < .01), supporting the hypothesis that digital competence predicts lower perceived stress (Giray, 2024).
Multiple regression, controlling for income, access type, and discipline, accounted for 18% of the stress variance (R² = .18, F(4, 395) = 21.7, p < .001), with digital literacy as a robust negative predictor (β = -.38, p < .001). Interaction analysis (Hayes' Model 1) revealed that high literacy can buffer stress, but only when daily device use is below approximately 9 hours; above this threshold, the protective effect plateaued, and high usage began to correlate positively with stress (Kumar, 2024).
Cluster analysis (k = 2) produced two distinct typologies: Cluster A with high literacy and moderate use (low stress), and Cluster B with moderate literacy but excessive usage (>10 hrs/day), which had the highest stress scores. These findings systematically address all research questions, demonstrating not only that literacy is beneficial but also that screen-time boundaries are critical for well-being. These insights form the empirical backbone for the recommendations and conclusions that follow.
2. What are the theoretical and practical conclusions of your thesis?
Sample Answer:
Theoretically, this study strengthens a hybrid framework that integrates Ng's digital literacy model with Kumar's technostress theory, showing that digital literacy subcomponents, especially cognitive and ethical communication skills, act as protective factors against negative emotional outcomes (Addai, 2024). These findings corroborate prior global research and extend it to the Philippine Gen Z context, highlighting nuances such as usage thresholds and the limited role of informal peer support.
Practically, the study offers several actionable conclusions:
- Curricula must elevate digital literacy beyond access to include critical evaluation, responsible communication, and online resilience training.
- Well-being interventions should incorporate digital wellness education, emphasizing safe usage limits and time-management practices.
- Long screen times (>9 hrs/day) negate the benefits of literacy, a reminder that healthy device habits are as important as competent use.
These results suggest that policymakers, universities, and counselors should not only promote digital skills but also embed usage-aware wellness programs. The evidence clearly indicates that digital literacy is necessary but not sufficient for emotional resilience; usage habits and institutional supports play pivotal roles in students' stress regulation.
3. What are your educational policy implications?
Sample Answer:
The study yields several direct implications for Philippine education policy at both secondary and tertiary levels:
- DepEd and CHED curricula should mandate digital literacy modules focusing on critical source evaluation, content-creation ethics, and mindful online habits. The evidence shows that increased literacy reliably lowers technostress (Giray, 2024).
- Policy frameworks should address screen-time pacing, for example by mandating no-screen zones and designated device-free hours during school days, drawing from digital wellness practices shown in behavioral health studies to improve sleep and reduce stress.
- Resource allocation is needed for digital equity programs, especially in under-resourced schools and provinces. Socioeconomic factors moderated stress outcomes in the study: students in low-income groupings with low literacy scores had disproportionately high stress (Kumar, 2024).
- Teacher training and support programs must be scaled to help educators model healthy digital habits and guide students in self-regulated digital use, as literacy-based instruction alone may not suffice.
Overall, the study supports a multi-tier policy approach that combines skill-building (literacy), behavioral guidance (usage habits), and structural support (access equity), thereby empowering Filipino Gen Z to be digitally literate and mentally resilient.
4. What recommendations does your thesis propose for universities and educators?
Sample Answer:
For universities and educators, the following recommendations are grounded in the study's findings:
- Implement formal digital wellness programs, such as orientation modules that teach time management, platform hygiene, and healthy sleep routines. Studies show these institutional supports significantly moderate technostress (Saleem, 2024).
- Institute "digital detox hours" or screen-free periods during academic days. Research indicates that brief, structured detox (e.g., 15-30 minutes daily) improves focus, reduces anxiety, and enhances mood (Anandpara, 2024).
- Train faculty to deliver feedback through multiple communication channels (email, LMS, in-person) and to provide clear timelines, reducing the cognitive load and ambiguity that contribute to technostress (Saleem, 2024).
- Establish peer-led self-regulation mentoring groups in which upper-level students guide new entrants in healthy device habits and digital workflows. Although informal support was not significant in the study, structured peer mentoring better preserves positive outcomes.
- Monitor screen-time data from institutional apps or the LMS to track usage patterns and identify students who exceed stress-threshold limits. Personalized messages or interventions can then be sent.
Taken together, these measures ensure that literacy training is complemented by usage-aware support, effectively translating competence into lasting digital resilience.
5. What recommendations do you suggest for students themselves?
Sample Answer:
Students are the primary agents of change; here is how they can apply the study's insights:
- Adopt intentional device-use habits, such as nightly "digital detoxes" that avoid screen exposure at least one hour before bedtime. Studies confirm that reducing late-night screen time improves sleep quality and lowers anxiety (Verywell Mind, 2025).
- Engage in self-regulated learning practices, including goal setting, time blocking, strategic platform-switching, and regular reflection. Research demonstrates that higher digital literacy enhances these SRL behaviors, which in turn reduces academic stress (Chen, 2025).
- Limit multitasking across multiple apps, particularly during study sessions; monotasking boosts focus and mental clarity (digital wellness insights).
- Join or form peer study groups focused on digital well-being, leveraging social accountability to regulate usage habits; structured groups have shown more impact than informal support alone (positive psychology principles).
- Track personal screen-use patterns using built-in screen-time tools, identifying triggers for prolonged usage; awareness leads to behavior change.
By combining digital literacy skills with self-regulation and offline checks, students can maximize the benefits of technology while maintaining emotional balance.
6. What should mental health and support services do?
Sample Answer:
Mental health and counseling services have a key role to play by integrating digital well-being into their programs:
- Develop culturally adapted digital mental health literacy campaigns, similar to the pilot "Tara, Usap Tayo!" initiative, to educate youth about technostress, coping strategies, and help-seeking behaviors (Martinez, 2022).
- Offer online and hybrid counseling options. While digital-first support platforms can be effective, their impact depends on structured follow-through; one study found limited empirical evidence for unstructured tools in boosting mental health outcomes (Inside Higher Ed report).
- Establish peer-support networks, guided by supervision, to create safe spaces for confiding about device-related stress; peer mentoring models reduce isolation and encourage proactive behavior (peer support research).
- Coordinate with student wellness programs to provide periodic "technology check-up" workshops, where students review their screen-time trends and deliberate on coping plans, as recommended by the National Academy of Sciences' campus well-being frameworks (SU Committee on Mental Health, 2021).
- Embed psychometric screening for high-risk users, such as those reporting more than 9 hours of daily device use or high stress scores, and refer them to targeted strategies including digital boundaries, sleep hygiene, and mindfulness.
These steps mainstream digital mental health into campus wellness infrastructure, bridging literacy skills with emotional support and behavioral health expertise.
7. What are the limitations of your study, and how can future research address them?
Sample Answer:
Several limitations shape the interpretation of this study and guide future directions:
- Cross-sectional design restricts causal inference. While associations between literacy and stress are clear, longitudinal or experimental designs (e.g., pre-post intervention studies) are needed to ascertain directional effects.
- Sampling bias arises from convenience sampling at three Metro Manila universities, which may not represent Gen Z populations in rural areas or other islands. Future studies should use stratified probability sampling across Philippine regions.
- Self-report measures may be affected by social desirability or recall bias. Future work could include objective logs (e.g., screen-time reports from phones) and physiological stress markers (e.g., heart-rate variability).
- Platform evolution is a moving target: social media and app norms change rapidly, so findings from 2025 may not hold beyond 2027. Future research should consider periodic replication.
- Limited scope of stress-buffering variables: informal peer/family support was not significant, contrary to older literature. Mixed-methods studies could explore the sociocultural nuances behind support dynamics.
- Threshold specification: the ~9-hour usage cutoff emerged post hoc. Future experimental studies can test whether specific screen-time limits reliably buffer the literacy-stress link.
- Absence of clinical mental health data: findings apply to non-clinical college populations only. Including students with diagnosed anxiety or depression could identify differential effects.
Addressing these limitations will allow future research to refine theoretical models, inform culturally tailored interventions, and expand generalizability across broader demographic contexts.
8. What is your final concluding message to the panel?
Sample Answer:
In closing, this study offers a clear and resonant message: digital literacy and digital well-being must go hand in hand. High levels of digital skill, especially in critical thinking, ethical use, and secure communication, significantly reduce perceived stress in Filipino Gen Z students. Yet this protective effect only holds when device usage remains within moderate boundaries (ideally under 9 hours daily). In essence, literacy without limits, or usage without self-awareness, leads to digital vulnerability.
Thus, empowering Gen Z requires more than teaching shortcuts and platform hacks; it demands instilling a balanced approach to technology in which self-awareness, mindful usage, and institutional support reinforce each other. Whether through improved curricula, wellness policies, or personal routines, the pathway to digital resilience is paved by merging competence with care.
By mobilizing these insights, universities, policymakers, counselors, and students themselves can create healthier digital ecosystems, ones that harness technology's strengths without sacrificing mental well-being or academic vitality. In a world increasingly dependent on screens, digital empowerment is not just a skill; it is a safeguard.
Thank you, panel; it has been a privilege to present these findings and contribute to the well-being and literacy of Gen Z in the Philippines.
If you need help with your research editing, proofreading, and data analysis, you can email us. Follow our Facebook for discount codes.




