Oral Presentations: Five
Advancing AI Classification for Public Responses to Crisis Communication
Analysis of the Rosenzweig-MacArthur model
Schur Rings over Numerical Semigroups
RedactaRAG: Strengthening Policy-Based Security in LLM Interactions
Developing Methods of eDNA Detection of Snakes
Montenegro's Medium (MM): A Serum-Free, Low-Cost Axenic Medium for Naegleria fowleri
Thinking Outside the LOX: Mechanisms of Meningeal Angiogenesis
Weekend Getaway? Evaluating the "Weekend Effect" in US Wildlife
The Characterization of Metal Catalysis and Cooperativity of Histidine Ammonia Lyase
A Fluorescence Imaging System for Environmental Microplastic Analysis
ZnNi Thin Film Morphology and Thickness on Manufactured Surfaces
pKa Characterization of the Novel Antioxidant, 3-hydroxythiophene-2-carboxylic acid
Sleep Efficiency's Impact on Reaction to Stressful Stimuli in College Students
The Problem With Subspecies in Hominid Taxonomy
The Effect of Digital and Analog Recorded Sounds on EEG Gamma, Theta, Delta Asymmetry and Affect
Mammalian Baselines Through Time in the Bear River Range
Effects of Winglet and Airfoil Geometry on the Lift-to-Drag Ratio of an Airfoil
Collegiate Rocket Apogee Control System Selection
Ethical and Technical Challenges of AI-Driven Spam Detection
Topography-Aware Modeling of Predicted Wind Velocities Around Sevier Dry Lake
Poems On Postcards: May Swenson's Travel Journal
Mary Shelley's Frankenstein, Longinus's Sublime, and the Transportation to Symbolism of Christian Doctrine
Speculative Fiction: The Unrecognized Tool for Social Change
From Ashes to Appointment: The Populist Cycle of Fall, Exile, and Glorious Return in Latin America
Effective crisis communication can save lives, yet public responses vary widely, ranging from calm, logical engagement to emotional reactions. This research investigates how people responded to COVID-19 crisis communication on social media, focusing on how message content and source influence engagement patterns. The central question asks: do people respond to crisis communication logically or emotionally, and what shapes that response? We began by mapping the content of crisis messages using topic modeling, categorizing tweets into topics such as risk warnings or protective-action instructions. This structured view revealed differences in message types and guided subsequent analysis of public responses. We hand-labeled 2,000 tweet replies as Discussion (logical, factual engagement) or Reaction (emotional, personal comment) and initially trained a Gaussian Process classifier, achieving approximately 62 percent accuracy; this highlighted the complexity of classifying responses at scale. To enhance the analysis, we first applied Meta's LLaMA to detect interpretable proxies, such as mentions of numbers, statistics, and money, which proved highly accurate. These proxies allowed scalable measurement of logical versus emotional engagement across the dataset. Building on this success, we next applied LLaMA directly to classify tweets as Reaction versus Discussion. This approach has now achieved approximately 80 percent accuracy, a substantial improvement over earlier methods, and demonstrates the potential of advanced AI for scalable social media analysis. We are currently working to further improve accuracy and to create an API that will allow real-time classification of public responses to crisis communication. This tool represents a major advance for public health agencies and policymakers seeking actionable insights.
Using the proxies and topic analysis, we observed that risk-related messages generated more statistics-based replies, indicating logical engagement, while protective-action messages elicited more emotional responses. Source analysis revealed that politician accounts generated more money-focused replies, reflecting economic concern, whereas government accounts, particularly federal ones, elicited more logical engagement. Our findings demonstrate that combining interpretable proxies with advanced AI classification provides scalable insight into social media engagement. This research offers a practical tool for crisis communicators to tailor messaging and represents a novel application of AI to improving public health communication strategy.
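The interpretable-proxy step lends itself to a simple sketch. The regular expressions and category names below are illustrative assumptions, not the study's actual LLaMA-based detectors:

```python
import re

# Illustrative proxy patterns for the categories described in the abstract
# (numbers, percentages/statistics, money mentions). These are assumptions,
# not the study's implementation.
PROXIES = {
    "number": re.compile(r"\b\d[\d,]*(?:\.\d+)?\b"),
    "percent": re.compile(r"\b\d+(?:\.\d+)?\s?%|\bpercent\b", re.IGNORECASE),
    "money": re.compile(r"[$€£]\s?\d|\bdollars?\b", re.IGNORECASE),
}

def proxy_flags(reply: str) -> dict:
    """Return which proxy categories fire for a single tweet reply."""
    return {name: bool(pat.search(reply)) for name, pat in PROXIES.items()}
```

For example, `proxy_flags("Cases rose 12% this week")` flags both a number and a percentage but no money mention, which under this scheme would count as logical (Discussion-like) engagement.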
The Rosenzweig–MacArthur predator–prey model refines the Lotka–Volterra predator–prey model by incorporating two important biological factors: carrying capacity of the prey and predator saturation. We reduce the number of parameters in the model using non-dimensionalization, identify steady-state solutions, and study their stability to understand the conditions that permit predators and prey to coexist. We develop a new nonstandard numerical method that is unconditionally stable and preserves the positivity of populations. We also compare it with traditional numerical methods to show that the new method is more computationally efficient.
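For reference, the model in one common parameterization (notation varies across the literature) is:

```latex
\begin{align}
\frac{dN}{dt} &= rN\left(1 - \frac{N}{K}\right) - \frac{aNP}{1 + ahN},\\
\frac{dP}{dt} &= \frac{eaNP}{1 + ahN} - mP,
\end{align}
```

where $N$ and $P$ are prey and predator densities, $r$ the prey growth rate, $K$ the carrying capacity, $a$ the attack rate, $h$ the handling time (which produces predator saturation via the Holling type II functional response), $e$ the conversion efficiency, and $m$ the predator mortality rate.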
Numerical semigroups arise naturally in the study of nonnegative integer solutions to Diophantine equations with natural-number coefficients. A Schur ring is a partition of an algebra which itself behaves algebraically. Like subalgebras and quotient algebras, Schur rings find algebraic structure by blurring the operations of the ambient algebra, which counterintuitively clarifies the algebraic structure that exists there originally. Marrow and Misseldine have generalized the notion of a Schur ring to the category of semigroups, that is, sets with some qualities of groups, such as associativity, but lacking the full divisibility that groups require. Numerical semigroups are semigroups formed by addition of natural numbers. This presentation will discuss Schur rings over numerical semigroups. We are motivated by three questions. First, the work of Marrow and Misseldine shows that the most important Schur ring over a semigroup is the coarsest one. We classify the coarsest Schur ring over a numerical semigroup based upon the parameters of the semigroup. Doing this provides insight into the second question, a conjecture due to Misseldine: "All Schur rings over a numerical semigroup are eventually discrete." A classification of the coarsest Schur ring provides a definitive answer to Misseldine's conjecture. Third, with the coarsest Schur ring classified, we are often able to enumerate all Schur rings over a numerical semigroup, again by the parameters of the semigroup. Progress has been made in the classification of Schur rings on specific types of numerical semigroups. Specifically, an explicit formula has been found that generates the maximum possible element of a partition on a numerical semigroup with two generators. Additionally, we have shown that for a numerical semigroup generated by consecutive natural numbers from m to 2m-1, there is no partition on any element above 2m-1.
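As a concrete illustration of the objects involved (standard definitions, not specific to the results above):

```latex
% A numerical semigroup generated by a_1,\dots,a_k with \gcd(a_1,\dots,a_k)=1:
S = \langle a_1,\dots,a_k\rangle
  = \{x_1 a_1 + \cdots + x_k a_k : x_i \in \mathbb{N}\}.
% For instance,
\langle 3,5\rangle = \{0,\,3,\,5,\,6,\,8,\,9,\,10,\,11,\,\dots\},
% which contains every integer from 8 onward; its largest gap
% (the Frobenius number) is 7.
```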
The rapidly increasing strength of Large Language Models (LLMs) has raised many questions about their role in education. Models such as ChatGPT and Google Gemini offer powerful capabilities, but their integration into academic settings requires careful consideration. One current approach, Retrieval-Augmented Generation (RAG), enables users to provide documents to LLMs, which then respond using those documents. However, traditional RAG methods lack flexibility and require strict attention to data security, so they should not be used in secure educational environments. To address these limitations, we present RedactaRAG, a modification of traditional RAG methods that enforces policy-based responses. RedactaRAG uses a knowledge-graph database containing nodes with different documents and pieces of information. Attached to these nodes are policies that instructors can put in place; for example, an instructor could write a policy that prevents discussion of certain pieces of information related to a document. These policies are flexible due to the knowledge-graph structure, allowing them to be applied selectively to certain users or assignments, or even globally to all interactions with the LLM. RedactaRAG contains three main components: (1) a front-end model that communicates with the user; (2) a back-end model that retrieves documents, filters them according to attached policies, and returns them to the first model; and (3) an evaluator model that ensures the first model's response aligns with the given policies. Preliminary tests of this system have focused on appropriately filtering extracts of Python documentation. Policies were given to our model that allowed acknowledgment of functions, prevented discussing the proper syntax of a function, or did not allow discussing a function at all. In all tested scenarios, the context was appropriately filtered and responses were given in accordance with the given policies.
Further testing on a large scale will be done using LLM-as-a-judge techniques to show its full capability. By creating a network of LLMs, we ensure that attached policies are effectively enforced. This provides effective security in RAG systems that is resistant to prompt engineering and data leakage. Instructors can use RedactaRAG in their courses to allow effectively supervised usage of LLMs on exams or assignments. Chatbots that need to access certain data can use RedactaRAG to protect potentially personally identifiable information. RedactaRAG is built in a way that is secure and scalable with future technology growth, so it can be continuously used in a rapidly-growing software ecosystem.
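The back-end filtering step can be illustrated with a minimal sketch. The `Node` shape, the policy strings, and the `filter_context` function are hypothetical stand-ins, not RedactaRAG's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A knowledge-graph node holding a document excerpt and its attached policies."""
    content: str
    policies: list = field(default_factory=list)  # e.g. ["deny:exam1"] (illustrative)

def filter_context(nodes, assignment):
    """Return only the content that the attached policies allow for this request."""
    allowed = []
    for node in nodes:
        if "deny_all" in node.policies:
            continue  # instructor blocked this node for all interactions
        if f"deny:{assignment}" in node.policies:
            continue  # blocked for this specific assignment only
        allowed.append(node.content)
    return allowed
```

In the full system described above, the filtered content would be handed to the front-end model, and the evaluator model would then check the response against the same policies.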
Snakes play an important role in their respective habitats. However, they are difficult to find via traditional survey methods due to their elusive and cryptic behaviors. Limited detection of these species hinders accurate population estimates and consequently makes developing conservation protections for them more challenging. Here we demonstrate the viability of using environmental DNA (eDNA) from soil and cloth to determine the presence or absence of the elusive snake species Coluber constrictor. We developed new methods of eDNA extraction to attain a more accurate and less invasive assessment of snake populations. Forty wooden coverboards were deployed in Utah's Dinosaur National Monument as artificial shelters for native snake species, which use them for thermoregulation. Each board was surveyed weekly. When a snake was observed under a board, it was recorded, and soil was collected from beneath the board. The soil was then taken back to the lab and DNA was isolated using a Qiagen PowerSoil isolation protocol. Negative controls were taken from boards where snakes were not seen. Large cloths were also attached to the undersides of the coverboards for approximately 4 weeks during our surveys. These cloths were collected and processed using aquatic eDNA techniques and a modified Qiagen blood-and-tissue extraction protocol. PCR on the soil samples revealed banding similar to our positive-control DNA from Coluber constrictor, but sequencing showed mostly Gammaproteobacteria rather than snake DNA. This suggests that DNA from soil bacteria may overwhelm any snake DNA present; the method therefore needs further refinement. In contrast, 3 of 40 cloth samples (7.5%) showed banding consistent with Coluber constrictor and were confirmed using restriction digest. Traditional surveys detected snakes at 8 of 40 coverboards (20%).
While eDNA detection rates were lower, this study has shown that eDNA extraction from cloth is feasible and, if refined and coupled with traditional surveys, can improve overall detection, particularly where the boards yielding cloth DNA detections differ from those with positive visual detections.
We describe a simple serum-free medium (Montenegro's Medium, MM) that supports axenic growth of Naegleria fowleri. MM uses common ingredients, no antibiotics, and standard sterile technique. Mean cell density rose from 2.05×10⁵ to 1.72×10⁷ cells·mL⁻¹ across 168 h. A single-phase fit across 0–168 h gave a doubling time near 28 h; early windows were faster (≈12 h for 0–24 h; ≈9 h for 24–48 h). Fresh batches outperformed 30–35-day batches at 48 h (≈1.55×) and 72 h (≈2.77×). Direct transfer gave the best yield when passaged on Day 3 (TD3); TD4 was similar across methods, and TD5 showed overlapping ranges. All work traced to one starting batch (2.05×10⁵ cells·mL⁻¹ at T0) expanded in three source flasks and subcultured more than fifty times without loss of routine growth. A price check puts MM at cents per 100 mL, while serum-dependent formulas fall in the $8–$23 per 100 mL range. MM is a practical option for labs that culture N. fowleri.
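As a sanity check on these growth figures, the standard exponential doubling-time formula can be applied to the endpoint densities. The endpoint-based value (≈26 h) differs slightly from the abstract's ~28 h because the latter comes from a single-phase fit across the whole series:

```python
import math

def doubling_time(n0: float, n1: float, hours: float) -> float:
    """Doubling time assuming exponential growth between two cell counts."""
    return hours * math.log(2) / math.log(n1 / n0)

# Endpoint densities from the abstract: 2.05e5 -> 1.72e7 cells/mL over 168 h
td_overall = doubling_time(2.05e5, 1.72e7, 168)  # ≈ 26 h
```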
The meninges are a multilayered, vascularized set of membranes that surround and protect the central nervous system. Despite life-threatening pathologies such as meningitis and meningiomas accounting for nearly 250,000 deaths each year, we have a limited understanding of meningeal biology. Our lab studies Fluorescent Granular Perithelial cells (FGPs), a meningeal cell population analogous to mammalian perivascular macrophages (PVMs), which associate with meningeal blood vessels. Notably, FGPs express the lysyl oxidase (LOX) enzyme Loxl1, known to induce vessel formation (known as ‘angiogenesis’) in other systems. Previous treatment of zebrafish embryos with the pan-LOX inhibitor β-aminopropionitrile (BAPN) resulted in decreased vascular density, leading us to hypothesize that FGPs contribute to meningeal angiogenesis via Loxl1 in developing zebrafish. However, BAPN is known to cause off-target effects in several tissues, including blood vessels. To account for this, we treated transgenic zebrafish embryos, in which blood vessels and FGPs are fluorescently labeled, with different doses of the pan-LOX inhibitor PXS-4787, which is less prone to off-target effects. We then imaged the treated embryos using confocal microscopy, allowing us to quantify meningeal vascular density changes from z-stacked fluorescence images. Preliminary results indicate a significant decrease in meningeal vascular density after PXS-4787 treatment, similar to our BAPN findings. This suggests that LOX activity is linked to meningeal vasculature development and is not just an off-target effect of BAPN treatment. By dissecting the roles of FGPs and LOX activity in meningeal vascular development, our work aims to address the longstanding gap in knowledge about the mechanisms underlying meningeal vascular regulation.
As the world's human population continues to concentrate within urban areas and these landscapes continue to expand worldwide, wildlife is under pressure to adapt to novel environmental disturbances. Along wild-to-urban gradients, and especially within less developed areas, human recreation can affect wildlife behavior. Using a dataset spanning the entire contiguous United States from 2019-2021, we assess the effects of periodic increases in human recreational activity during the weekend on wildlife temporal activity behavior, distribution, and breeding activity. This dataset contains 53 terrestrial mammal species, restricted to those easily distinguishable from juvenile to adult in trail-camera photographs, specifically large ungulate and carnivore species. We assessed whether periodic weekend increases in human recreational activity elicited a change in mammalian temporal activity, behavior, and breeding patterns at the community-wide and species-specific levels. We found little evidence supporting the presence of a temporal ‘weekend effect’. At the community-wide scale, mammal diel activity patterns did not change in response to periodic weekend increases in human recreational activity. At the species-specific scale, only big game species such as elk (Cervus canadensis) and American bison (Bison bison) showed significantly altered temporal activity patterns during the weekend. However, people themselves significantly altered their temporal activity patterns during the weekend, with more activity occurring at midday and less in the early evening, leading to consistent decreases in human-wildlife temporal overlap. This study highlights the possibility that altered human activity patterns during periods of increased activity could serve as a human-wildlife conflict mitigation strategy.
Histidine ammonia-lyase (HAL) is an enzyme that participates in the degradation of L-histidine by catalyzing the elimination of the alpha-amine to form trans-urocanate. In its active site, HAL contains an unusual cofactor 4-methylidene-imidazol-5-one (MIO) that is formed by post-translational backbone cyclization involving three residues (Ala-Ser-Gly). A solvent-exposed pocket near the MIO group is the likely substrate binding site. Although HAL has been studied for many years, open questions about its mechanism remain, including which substrate group acts as a nucleophile and whether divalent metals are involved in catalysis. Experimental and computational analyses have proposed the presence of metal ions to be valuable in enzyme catalysis. However, different researchers have produced contradictory results about the importance of metal ions and the effects of metal chelators on the enzyme’s activity. We are measuring the kinetic impacts of adding metals, such as manganese, nickel, cobalt, and magnesium, to observe changes in activity to determine if metals do play a role in the HAL mechanism. In addition, HAL is known to be allosterically regulated in bacterial cells, but in most kinetic studies, no cooperativity with regard to substrate or inhibitor binding has been observed. We have observed both cooperative and noncooperative behavior in separate purifications of HAL. Our current work aims to map the effect of small molecule inhibitors on the cooperativity of the enzyme.
As global plastic production continues to rise, the prevalence of microplastics in natural environments has become a pressing ecological concern. Understanding the composition and distribution of these microplastics is essential for identifying pollution sources and developing effective mitigation strategies. However, traditional methods for isolating and characterizing microplastics are often slow and labor-intensive, particularly when samples contain large amounts of organic or mineral material. This project presents the development of an automated fluorescence imaging system designed to streamline the identification and quantification of microplastics collected on filter paper. The system employs hydrophobic fluorescent dyes that selectively bind to plastic particles, allowing for a simple way to differentiate between plastics and surrounding environmental debris. Using MATLAB, the program automates image acquisition, stitching, and analysis to generate a coordinate map of the filter paper. This map provides a standardized reference frame that enables researchers to efficiently catalog, revisit, and further analyze regions of interest without redundant imaging. By reducing the need for manual identification and data organization, this imaging workflow increases the efficiency and reproducibility of microplastic analysis. Ongoing work focuses on integrating Raman spectroscopy into the system for material-specific identification. By combining fluorescence-based localization with Raman spectra, the system aims to enable rapid wide-field detection followed by targeted compositional analysis. This approach reduces the need for full-field Raman mapping, increasing efficiency while maintaining material specificity. Overall, this work contributes to the advancement of optical techniques for environmental monitoring. 
The proposed fluorescence imaging system offers an efficient approach for studying microplastics, supporting efforts to better understand and mitigate the growing impact of plastic pollution.
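The coordinate-map idea can be sketched as follows (in Python rather than the project's MATLAB; the tile size and overlap values are illustrative assumptions, not the actual imaging parameters):

```python
def tile_origin(row: int, col: int, tile_px: int = 1024, overlap_px: int = 128):
    """Global pixel coordinates of a tile's top-left corner in the stitched map."""
    step = tile_px - overlap_px  # tiles advance by (tile size - overlap)
    return (col * step, row * step)

def particle_global_xy(row, col, x_in_tile, y_in_tile, tile_px=1024, overlap_px=128):
    """Convert a particle detection inside one tile to stitched-map coordinates,
    so a region of interest can be revisited without redundant imaging."""
    ox, oy = tile_origin(row, col, tile_px, overlap_px)
    return (ox + x_in_tile, oy + y_in_tile)
```

A standardized mapping like this is what lets later steps (for example, targeted Raman acquisition) return to a cataloged particle directly.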
Thin films are important in many technologies, from electronics to protective coatings. The way these films grow can change depending on the shape of the surface they are deposited on. In this project, we studied how a ZnNi thin film grows when deposited onto three different surfaces: a flat plate, bolts, and a screw. Using scanning electron microscopy (SEM), we examined the overhead surface morphology and measured film thickness through cross-sectional imaging. Energy dispersive X-ray spectroscopy (EDS) was also used to determine the chemical composition of the film. Comparing a cross-section of the flat plate to the deposition on more complex surfaces, we observed differences in how the thin film covered the surfaces and how its thickness varied. This study shows how surface geometry influences thin film growth and provides insights for improving thin film deposition on non-flat materials.
The objective of this project was to determine the acid dissociation constant (pKa) of 3-hydroxythiophene-2-carboxylic acid (HTC), a bicyclic sulfur-containing compound with recently characterized antioxidant properties. The pKa is related to the acidity and bond enthalpy of HTC, which affect its antioxidant properties. Measurement of the pKa was accomplished using a sodium hydroxide (NaOH) solution standardized against potassium hydrogen phthalate (KHP) as a primary standard. A small amount of HTC (less than 0.01 g) was dissolved in dimethyl sulfoxide (DMSO) and diluted into deionized water with sodium chloride (NaCl) to improve solubility and maintain constant ionic strength. The NaOH titrant was then added slowly while the pH was measured with a calibrated electrode. The titration data were analyzed using first-derivative digital filters and Gran-plot methods to find the equivalence point and calculate the pKa. Results showed that HTC likely exists in two forms in solution, with different hydrogen-bonding orientations. It acts as a weak acid, with pKa values of approximately 3.67 and 4.00 for the two forms, supporting its potential role in antioxidant reactions involving proton transfer and electron movement.
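The analysis pipeline (first-derivative endpoint detection, then pKa read from the half-equivalence point) can be sketched on a synthetic monoprotic titration curve. All numbers below are illustrative, not the HTC data:

```python
import math

# Synthetic weak-acid titration (illustrative values, not the HTC experiment)
pKa_true = 3.80
Veq = 10.0      # mL of NaOH at the equivalence point
C_base = 0.010  # mol/L titrant
V0 = 50.0       # mL initial sample volume

def pH(v):
    """Idealized curve: Henderson-Hasselbalch before equivalence, excess OH- after."""
    if v < Veq:
        return pKa_true + math.log10(v / (Veq - v))
    excess = (v - Veq) * C_base / (V0 + v)  # [OH-] in mol/L past equivalence
    return 14.0 + math.log10(excess)

vols = [0.25 + 0.5 * i for i in range(40)]  # 0.25..19.75 mL; grid avoids v == Veq
curve = [pH(v) for v in vols]

# First-derivative filter: the equivalence point sits at the steepest step
slopes = [(curve[i + 1] - curve[i]) / (vols[i + 1] - vols[i])
          for i in range(len(vols) - 1)]
i_max = max(range(len(slopes)), key=slopes.__getitem__)
V_eq_est = 0.5 * (vols[i_max] + vols[i_max + 1])

# For a monoprotic weak acid, the pH at half-equivalence equals the pKa
half = 0.5 * V_eq_est
j = max(i for i, v in enumerate(vols) if v <= half)
frac = (half - vols[j]) / (vols[j + 1] - vols[j])
pKa_est = curve[j] + frac * (curve[j + 1] - curve[j])  # linear interpolation
```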
Around 50% of college students report daytime sleepiness and 70% report insufficient sleep (Hershner & Chervin, 2014). Kingshott (2024) found that after two weeks of sleeping six hours or less per night, students perform as poorly as those deprived of sleep for 48 hours. Sleep deprivation affects physiological stress markers, elevating blood pressure, increasing skin conductance, suppressing immune function, and altering endocrine activity (Massar et al., 2017). These compounding effects can severely harm physical and psychological health (Madore et al., 2020; Wright et al., 2015). Although prior studies have linked sleep, health, and stress (Cappuccio et al., 2011; Massar et al., 2017), few have examined both self-report and physiological data together. Combining subjective and physiological measures provides a fuller understanding, since individuals' perceptions of sleep may not match objective quality. Research also lacks data on both male and female college students, as most studies focus on adults or adolescents (Ly et al., 2015; Massar et al., 2017). This study aims to address those gaps and clarify how sleep and stress interact. Acute stress, a short-term physiological or psychological reaction, can elevate blood pressure, heart rate, and Galvanic Skin Response (GSR). Poor sleep efficiency has been linked to greater reactivity to acute stress (Wright et al., 2015). Chronic stress exposure damages health over time, making sleep quality essential for stress resilience, especially among highly stressed college students. GSR, also called Skin Conductance or Electrodermal Activity (BIOPAC, 2013), reliably measures physiological stress through electrical changes in skin tissue driven by Autonomic Nervous System (ANS) activation (Martinez et al., 2019). In this study, GSR will be recorded with a BIOPAC MP36 system to measure physiological stress during a simultaneous arithmetic and memory recall task.
This noninvasive, low-cost method provides accurate, continuous data on ANS activity and stress responses. Actigraphy, a wearable sleep-tracking method, measures body movement, light, and activity over several days using a device such as the ActiGraph wGT3X-BT (ActiGraph, Pensacola, FL). It detects restlessness and movement during sleep, providing detailed patterns of sleep quality. Participants will wear the device continuously for seven days, except while showering, and return it for data collection during a Wednesday stress test. Scheduling data collection midweek minimizes weekend variability, since students sleep longer and more irregularly on weekends (Fudolig et al., 2024; Vestergaard et al., 2024).
In scientific communities, including the field of anthropology, it is essential to communicate effectively. This cannot be done when the definition of species is vague and the concept of subspecies is not universal. Every element of taxonomy must be defined with great specificity. However, because evolution is a spectrum rather than a singular chain, creating taxonomic boundaries is complicated. Currently, there are several ways to define a species: phenotypic, genotypic, and distribution-based strategies. Another is reproductive compatibility, whether two populations can interbreed. Human species are generally differentiated based on behavior or physical characteristics such as brain size, dexterity, or possession of language. Many human species are considered subspecies due to their similarities to other species; this is especially true for Denisovans and Neanderthals. When separating species, intraspecific variation must be considered. However, some subspecies variation is very broad, and some variations are negligible. In this essay, I argue that there must be an exact point at which natural variation constitutes a separate species. The boundaries for species classification must be explicit. I also argue that it is critical to decide, cohesively, whether subspecies is a valid method of classification. I conclude that the current definitions of both subspecies and species are so vague that the classification is inefficient. Because it is inefficient, the scientific community, and specifically the anthropological community, should not use the classification of subspecies when referring to populations of hominid species. Rather, the definition of species should be made more specific, in which case subspecies classification will become irrelevant.
Concern about the effects of screen time on well-being has garnered attention among researchers, medical professionals, and the lay public. Attention has largely focused on the content watched, a corresponding decrease in meaningful face-to-face contact, digital multitasking, unhealthy levels of instant gratification, exposure to blue light, and poor-quality sleep. A factor no one outside of our lab has considered is the sound that digital devices emit. Live analog and recorded analog sounds are indistinguishable in their physical properties (i.e., electrical signatures and frequency ranges). Digital sounds are converted into analog format from a digital file such as .mp3 or .wav. Importantly, digital sounds are not identical to their live analog or recorded analog counterparts. Digital sounds have a unique electrical signature, do not represent the full range of frequencies of the initial sound source, and are accompanied by digital noise that can be minimized and masked, but not eliminated. No other sound produced in nature shares these properties. As such, digital sound may affect the brain and behavior in as-yet-undiscovered ways. Here, we report on two experiments. In experiment one, participants listened to a 5-minute recording of a babbling brook in analog (i.e., cassette tape) or digital (i.e., SD card) format. EEG was recorded from 32 channels throughout the 5-minute audio exposure. Participants completed the PANAS mood inventory before and after the audio session. Power spectral densities were exported and analyzed within the gamma, theta, and delta bands. In experiment two, participants completed the BMIS mood scale. They were then randomly assigned to one of six treatment combinations. Specifically, they listened to a digital (SD card) or an analog (cassette tape) recording of either the song Hotel California (The Eagles, 1976), a babbling brook, or zebra finch chirps. Five minutes later, with the sound still playing, they completed the BMIS a second time.
The data is currently under analysis, and the results will be presented at the conference.
The effects of anthropogenic climate change on contemporary animal communities are a key concern for climate and environmental experts going forward. Importantly, however, to fully understand the potential impacts of climate change on zoological life today requires an understanding of how animal populations responded to past climatic changes that occurred over timescales exceeding those of direct human observations. Paleozoological data provides such a record and documents baselines of animal communities that can be used to evaluate historic anthropogenic change and attest to the responses of species to ecosystem changes over geological timescales. This project builds paleo and modern zoological baselines for Western North America’s Bear River Basin, straddling modern-day Utah, Idaho, and Wyoming, using three data types: 1) Paleontological survey of two high-elevation cave assemblages (Boomerang and Thundershower Caves), 2) modern camera trap data, and 3) modern museum live trapping surveys. Using these baselines, we compare and evaluate changes in species diversity and richness over time. Additionally, using Random Forest statistical analysis, we highlight the analytical power of machine learning for understanding variation in species that exists among the datasets. Finally, we discuss the implications of our findings for Tribal-led ecological restoration efforts taking place in the region.
Aircraft have played a crucial role in our world since their invention. Among their most critical components are the wings and winglets, whose aerodynamic characteristics (lift generation, drag reduction, fuel efficiency, structural loading, flight performance) directly influence flight efficiency, stability, and overall operational cost. This ongoing research investigates aerodynamic performance across multiple angles of attack, Reynolds numbers, and winglet configurations, enabling direct comparison between different geometries under controlled flow conditions. The research also aims to characterize the aerodynamic performance and efficiency of commercial aircraft wings and winglets in a wind tunnel, focusing on the lift-to-drag ratio. A baseline will be established from observations and data in previously published research; referenced articles will be cited in the paper and presentation. Experiments are being conducted using various scaled wing models with different airfoils, 3D-printed in PLA filament, and a measurement setup consisting of two load cells connected to a sting mount. The data from the cells will be processed by a microcontroller and used to make observations and calculations. The conclusions of this study will provide detailed insight into how wing geometry can be optimized to increase overall aerodynamic efficiency. The results should help with future development in the aerospace sector by increasing understanding of how wing geometry and winglet configurations can be improved for better overall performance. A side benefit may also be better geometry for aesthetic concerns.
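For reference, the lift-to-drag comparison rests on the standard nondimensional coefficients:

```latex
C_L = \frac{L}{\tfrac{1}{2}\rho V_\infty^2 S}, \qquad
C_D = \frac{D}{\tfrac{1}{2}\rho V_\infty^2 S}, \qquad
\frac{L}{D} = \frac{C_L}{C_D}, \qquad
\mathrm{Re} = \frac{\rho V_\infty c}{\mu},
```

where $L$ and $D$ are the measured lift and drag (here, from the two load cells), $\rho$ the air density, $V_\infty$ the freestream velocity, $S$ the reference wing area, $c$ the chord, and $\mu$ the dynamic viscosity.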
Brigham Young University's Rocketry Association competes annually at the International Rocket Engineering Competition (IREC), where achieving a precise apogee is a critical performance metric. To improve altitude control for future launches, the Guidance, Navigation, and Control (GNC) team is developing a closed-loop apogee control system with a precision goal of ±2 m. This work evaluates several potential control methods, including fin actuation, thrust vector control (TVC), thrust cutoff, variable-mass water ballast, and deployable airbrakes, based on control authority, system mass, and modeling complexity. MATLAB simulations of first-order dynamic models were used to estimate altitude response to actuation inputs and assess the practicality of closed-loop control for each concept. Rough mechanical models were also developed to determine feasibility of manufacture, system complexity, and mass estimates. Fin actuation and thrust vector control were deemed too complex to simulate accurately and unhelpful to the objective of apogee control. Thrust cutoff had sufficient control authority but was determined to require sub-second timing precision beyond our sensors' capability. The control authority of water ballast scales with mass, and over a kilogram would have been required, significantly reducing altitude performance. Several airbrake architectures were considered, including multibar mechanisms actuating convex flaps, but simple tab-based systems offered sufficient authority and precision with minimal mechanical complexity. Deployable tab-based airbrakes therefore offered the best tradeoff of mass, control authority, and complexity of design and manufacture, and were selected for prototype development. This study provides a framework to inform the design of future precision apogee systems for collegiate rockets.
Ongoing work focuses on refining simulation accuracy, designing the actuation mechanism, and experimentally validating performance through prototype testing planned for April 2026.
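The first-order altitude models mentioned above can be illustrated with a simple coast-phase integration: after burnout the rocket decelerates under gravity and quadratic drag, and deploying an airbrake tab raises the drag area and lowers apogee. This is a minimal sketch, not the team's MATLAB model; the mass, velocity, and drag-area values in the example are hypothetical:

```python
G = 9.81     # gravitational acceleration, m/s^2
RHO = 1.225  # air density, kg/m^3 (held constant for simplicity)

def coast_apogee(v0, h0, mass, cd_area, dt=0.01):
    """Integrate the vertical coast phase (motor off) with quadratic
    drag and return the apogee altitude in metres.

    cd_area is the drag coefficient times reference area (CdA, m^2);
    deploying airbrake tabs increases it.
    """
    v, h = v0, h0
    while v > 0.0:
        drag = 0.5 * RHO * cd_area * v * v   # quadratic drag force, N
        v += (-G - drag / mass) * dt          # decelerate over one step
        h += v * dt                           # climb over one step
    return h
```

Running such a model with the tabs stowed (small CdA) versus deployed (larger CdA) quantifies the available control authority, which is the kind of comparison used to rank the candidate mechanisms.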
This research investigates the ethical challenges surrounding AI-driven spam detection, focusing on the central question: How can spam-filtering technologies protect users from harmful digital communication while respecting consent, privacy, and moral responsibility? The purpose of the study is to analyze how modern spam-detection systems work, why they sometimes fail, and how ethical frameworks—such as utilitarianism and deontology—can guide the design of safer, more responsible digital communication tools. The research draws on a mixed methodology that includes technical experimentation, ethical analysis, and real-world case study review. Practically, the study is grounded in the development of a simple spam-classification model built as part of a web-programming project. The model categorized messages based on trigger words, link frequency, and manually constructed rules. Observations from this project were compared with real incidents reported in news and technical analyses, such as the WinRed spamtrap controversy, where legitimate emails were blocked or rerouted due to poor data practices and inadequate respect for unsubscribe requests. In addition, insights from academic literature and industry reports were used to evaluate how spam filters impact users, how attackers exploit email systems, and how automated communication tools can unintentionally violate user consent. The study concludes that AI-based spam detection is both essential and ethically fragile. While spam filters prevent the majority of cyberattacks—many of which begin with phishing emails—they often misclassify legitimate messages, disrupt communication, or rely on questionable data-collection practices. Ethical analysis shows that utilitarian benefits alone cannot justify invasive or poorly designed systems, and that a deontological commitment to respecting user consent must guide the development of digital communication tools. 
The significance of this research lies in demonstrating that technical accuracy and ethical responsibility must operate together. Systems like email marketing tools, fundraising platforms, and automated messaging services must prioritize transparency, respect unsubscribe requests, audit their data practices, and communicate clearly with users. Ultimately, this study suggests that the future of spam-detection technology depends on designing systems that are not only technically effective but also morally accountable—protecting users while honoring their autonomy and digital consent.
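The hand-built classifier described in the study (trigger words, link frequency, manually constructed rules) can be sketched in a few lines. The word list, weights, and threshold below are illustrative placeholders, not those used in the project:

```python
import re

TRIGGER_WORDS = {"free", "winner", "urgent", "claim", "act now"}  # illustrative list

def spam_score(message, word_weight=1.0, link_weight=2.0):
    """Score a message with hand-constructed rules:
    one point per trigger word present, two per embedded link."""
    text = message.lower()
    word_hits = sum(1 for w in TRIGGER_WORDS if w in text)
    link_hits = len(re.findall(r"https?://", text))
    return word_weight * word_hits + link_weight * link_hits

def is_spam(message, threshold=3.0):
    """Flag a message once its rule score crosses the threshold."""
    return spam_score(message) >= threshold
```

Such rules are transparent and auditable, which matters ethically, but they misclassify easily: a legitimate newsletter containing links and the word "free" scores like spam, illustrating the false-positive problem the study highlights.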
Multiple factors influence dust emission in Utah, including wind velocity, soil properties, and vegetation characteristics. To better understand the distribution of dust in Utah and its subsequent effects on Utahns, it is important to accurately characterize these factors. Wind velocity is the most important variable governing dust emission from erodible surfaces; because emission scales roughly with the cube of wind speed, a doubling in wind velocity is associated with an eightfold increase in dust emission from such a surface. In Utah, basin-and-range topography can influence wind velocity through channeling processes. Localized basin-and-range topography is often too small to be resolved by traditional mesoscale modeling, and so its effect on wind velocity is not well represented. This research examines the effects of fine-resolution, topography-aware modeling on wind velocity. To resolve topographic effects, a mass-conserving diagnostic wind model called WindNinja was chosen. WindNinja takes as inputs an initial wind velocity field and a static, fine-resolution (e.g., 30 m) digital elevation model (DEM). Using a conservation-of-momentum solver, WindNinja then calculates a wind velocity field with finer resolution (e.g., 150 m). For this study, a known dust event (April 18, 2023) was considered, with the region of interest as Sevier Dry Lake, Utah. The initial wind velocity field (4 km resolution) was obtained from the Weather Research and Forecasting (WRF) model, and the DEM was sourced from Shuttle Radar Topography Mission (SRTM) data. The computational domain considered was 200 km by 200 km and was consequently divided into subdomains (tiles). A WindNinja simulation was run for each tile, and the results were then recombined into a single mosaic. The wind velocities predicted by WindNinja display spatial patterns similar to the initial field from WRF, with significant localized differences.
WindNinja velocities were locally higher due to topographic effects—for example, wind velocities at ridgelines were up to double those from WRF. The location of these velocity differences is primarily determined by the interaction of wind direction and topography (i.e., terrain orientation). By capturing the effects of Utah’s topography on wind speed through modeling, spatial profiles of dust emission can be better characterized. Especially for dust sources located near areas with abrupt topography changes, the difference in predicted wind speed could dramatically affect estimated dust emissions. Future application of these results could be used to improve understanding of dust emission from localized sources such as gravel pits and aggregate mining operations.
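The sensitivity described above follows from the cubic dependence of dust emission on wind speed implied by the abstract (doubling the wind gives an eightfold emission increase). A minimal sketch of how a localized WindNinja speed-up translates into a predicted emission ratio; the velocities in the example are hypothetical:

```python
def emission_ratio(u_local, u_background):
    """Ratio of predicted dust emission at a point, assuming emission
    scales with the cube of wind speed (so 2x wind -> 8x emission)."""
    return (u_local / u_background) ** 3
```

A ridgeline where WindNinja doubles the 4 km WRF wind speed thus implies roughly eight times the emission a coarse model would predict, which is why topography-aware downscaling matters for dust sources near abrupt terrain.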
One of the most prolific American poets of the 20th century was May Swenson (1913-1989), whose deep understanding of and connection to the world brought forth poetic innovation. She saw the world uniquely and was able not just to observe but to chronicle that perspective in a way that has cemented her name in the American poetry canon. Swenson received many literary awards, including a Guggenheim Fellowship, and won the Shelley Memorial Prize. In 1960, she travelled across Italy and France with her partner. Throughout her travels, she kept a daily postcard journal, recording what she saw and did each day. Excerpts, and sometimes full entries, from these postcards were transformed into poems. This project maps her journey across Italy and France, along with the path from inspiration to the poems she published in the wake of her trip, in order to learn more about her process of writing and observation. This is done through the study and transcription of her postcard journal, which can be compared against published poems such as “Instead of the Camargue,” “While Sitting in the Tuileries,” “Above the Arno,” “The Pantheon, Rome,” and more. Through this study of her process, not only do we gain valuable insight into the way Swenson studied the world around her, we may also find poems yet undiscovered, hidden within her postcard journals. In bringing more of Swenson’s transformative words to light, we can further illuminate her thinking and better emulate not just her writing but her way of viewing the world.
Mary Shelley’s “Frankenstein” is a spectacle not only in Gothic fiction but in science fiction as a genre. The novel comes to symbolize God the creator and the first man, Adam. In On the Sublime, Longinus names five key elements that contribute to the sublime: greatness of thought, strong and noble emotion, proper use of figures of speech, noble diction, and skillful composition. The sublime is a concept that has shaped and changed Western thought, and this paper argues that the sublime is a rhetorical device that takes a reader to specific places and spaces of thought. The prevailing discourse around Shelley’s novel holds that it is about nature versus nurture and the relationship between a parent and a child. Through the sublime, the novel changes from a cautionary tale about bad parenting to a theological representation of God the creator and his first man, Adam. One passage that supports this argument comes when Victor and his creature discuss another creation for the creature. The creature states, “I demand a creature of another sex, but as hideous as myself: the gratification is small, but it is all that I can receive, and it shall content me. It is true, we shall be monsters, cut off from all the world; but on that account we shall be more attached to one another” (Shelley 1818). The creature intends to cut himself and his mate “off from all the world” (Shelley 1818) the same way that God cast Adam and Eve out of the Garden of Eden. This implies that what is merely the world for Victor is the Garden of Eden for his creature. The sublime is in effect by making the world seem vast and grand enough for Victor’s creature to be cast out of, and with that the symbolism becomes closer and more accurate. Readers should care because word choice and the sublime can truly move people and be more than just landscapes in literature.
The sublime can be symbolic of our relationship with God, and Mary Shelley’s novel expresses that relationship. The use of the sublime and the creature shows the responsibility of a creator and demonstrates that word choice can move a character, and a reader, to act or to change perspective on certain topics.
Speculative fiction has often been used as a tool to expose the injustices and the reality of real-world problems such as racism. How is it that fiction, particularly of the unrealistic kind, can be an effective tool to spark social change? This paper explores how fiction uses speculative settings—futuristic worlds, time travel, and dystopian societies—to reflect and critique social problems we face in the real world. It further examines non-fiction literature’s ability to ignite social justice; Martin Luther King Jr.’s “Letter from Birmingham Jail,” for example, didactically explains the injustice and racism happening in America in the mid-twentieth century. His letter is certainly not a piece of speculative fiction, but its non-fiction approach to systemic racism provides the background that my analysis of speculative fiction will build on. The genre of speculative fiction, including science fiction and fantasy, has cryptically described racism and other social problems through the metaphor of “the other,” exposing racialized identity and marginalization. Ultimately, science fiction not only mirrors the realities of racism but also reimagines possibilities for resistance, empathy, and transformation in more equitable futures. In this essay, I will also examine real-world changes and effects from science fiction literature using the case study of Kindred by Octavia Butler as well as non-fiction literature. I will analyze how Kindred has exposed problems in society and can bring about social change and awareness. In particular, I will discuss how science fiction has often been a tool or medium for discussing taboo subjects before they were widely socially accepted. My paper will also explain some of the drawbacks of speculative fiction as a tool for social change, and why as a genre it has not always been taken seriously.
In conclusion, my essay argues that speculative fiction can create social change, working together with non-fiction pieces to expose hard truths about society through relatable characters, compelling plots, and emotional resonance.
It has been said that history doesn’t repeat itself, but it often rhymes. This project investigates the rhetorical echoes among political leaders who, after losing power, return to reshape their nations under new circumstances. In a world where figures like Luiz Inácio Lula da Silva in Brazil and Donald Trump in the United States have reclaimed the presidency after time away, such comebacks remind us that history’s cycles of return and redemption are not unique to our era. At the heart of this study is the political revival of Getúlio Vargas, who led Brazil from 1930 to 1945 and returned through democratic election in 1950, serving until 1954. I will transcribe and organize roughly 2,500 pages of campaign speeches from Vargas’s 1950 campaign, many of which exist only as scanned documents that have not yet been processed with optical character recognition (OCR). These primary materials, gathered from the Fundação Getúlio Vargas (CPDOC) and the Hemeroteca Digital Brasileira, will form the first standardized and searchable corpus of Vargas’s late political discourse in modern Portuguese orthography. Building on this corpus, my analysis will then compare Vargas’s rhetoric with that of Juan Domingo Perón, president of Argentina from 1946 to 1955 and again from 1973 to 1974. Both leaders, returning from political exile, relied on emotionally charged appeals to unity and national rebirth to restore legitimacy and consolidate moral authority. Using distant reading (as conceptualized by Franco Moretti) alongside close comparative textual analysis, I will examine how their populist discourse fused promises of economic modernization with narratives of moral renewal, positioning themselves as indispensable mediators between “the people” and the state. Finally, I connect these mid-twentieth-century populist comebacks to broader patterns in modern politics, where rhetoric of betrayal, redemption, and destiny continues to resurface.
By tracing these linguistic and symbolic strategies across time and geography, this project demonstrates how the mythology of the returning redeemer—embodied by Vargas and Perón—remains powerful in twenty-first century international politics.