Search results for: Michael Huber
16 Glocalization of Journalism and Mass Communication Education: Best Practices from an International Collaboration on Curriculum Development
Authors: Bellarmine Ezumah, Michael Mawa
Abstract:
Glocalization is often defined as the practice of conducting business according to both local and global considerations – this epitomizes the curriculum co-development collaboration between a journalism and mass communications professor from a university in the United States and the Uganda Martyrs University in Uganda, where a brand new journalism and mass communications program was recently co-developed. This paper presents the experiences and research results of this initiative, which was funded through the Institute of International Education (IIE) under the umbrella of the Carnegie African Diaspora Fellowship Program (CADFP). Vital international and national concerns were addressed. On a global level, scholars have questioned and criticized the Western model ingrained in journalism and mass communication curricula and proposed a decolonization of journalism curricula. Another major criticism is the practice of Western-based educators transplanting their curricula verbatim to other regions of the world without paying greater attention to local needs. To address these two global concerns, an extensive assessment of local needs was conducted prior to the conceptualization of the new program. The needs assessment adopted a participatory action model and captured the knowledge and narratives of both internal and external stakeholders. This involved review of pertinent documents, including the nation’s constitution, governmental briefs, and promulgations; interviews with governmental officials, media and journalism educators, media practitioners, and students; and benchmarking of the curricula of other tertiary institutions in the nation. Information gathered through this process served as a blueprint and frame of reference for all design decisions. In the area of local needs, four key factors were addressed. First, the realization that most media personnel in Uganda are both academically and professionally unqualified.
Second, the practitioners with academic training were found lacking in experience. Third, the current curricula offered at several tertiary institutions are not comprehensive and lack local relevance. The project addressed these problems as follows. First, the program was designed to cater to both traditional and non-traditional students, offering opportunities for unqualified media practitioners to obtain formal training through evening and weekend programs. Second, the challenge of inexperienced graduates was mitigated by designing the program around an experiential learning approach that many refer to as the ‘Teaching Hospital Model’. This entails integrating practice with theory, similar to the way medical students engage in hands-on practice under the supervision of a mentor. The university drew up a Memorandum of Understanding (MoU) with reputable media houses for students and faculty to use their studios for hands-on experience and for seasoned media practitioners to guest-teach some courses. Given the converged functions of today’s media industry, graduates should be trained to have adequate knowledge of other disciplines; therefore, the curriculum integrated cognate courses that would render graduates versatile. Ultimately, this research serves as a template for African colleges and universities to follow in their quest to glocalize their curricula. While the general concept of journalism may remain Western, journalism curriculum developers in Africa, through extensive assessment of needs and a focus on those needs and other societal particularities, can adjust the Western model to fit their local needs.
Keywords: curriculum co-development, glocalization of journalism education, international journalism, needs assessment
Procedia PDF Downloads 129
15 Older Consumer’s Willingness to Trust Social Media Advertising: A Case of Australian Social Media Users
Authors: Simon J. Wilde, David M. Herold, Michael J. Bryant
Abstract:
Social media networks have become the hotbed for advertising activities, owing first to their increasing consumer/user base and second to the ability of marketers to accurately measure ad exposure and consumer-based insights on such networks. More than half of the world’s population now uses social media (4.8 billion users, or 60%), with 150 million new users having come online within the last 12 months (to June 2022). As the use of social media networks by users grows, key business strategies used for interacting with these potential customers have matured, especially social media advertising. Unlike other traditional media outlets, social media advertising is highly interactive and digital channel specific. Social media advertisements are clearly targetable, providing marketers with an extremely powerful marketing tool. Yet despite the measurable benefits afforded to businesses engaged in social media advertising, recent controversies (such as the relationship between Facebook and Cambridge Analytica in 2018) have only heightened the role trust and privacy play within these social media networks. Using a web-based quantitative survey instrument, survey participants were recruited via a reputable online panel survey site. Respondents to the survey represented social media users from all states and territories within Australia. Completed responses were received from a total of 258 social media users. Survey respondents represented all core age demographic groupings, including Gen Z/Millennials (18-45 years = 60.5% of respondents) and Gen X/Boomers (46-66+ years = 39.5% of respondents). An adapted ADTRUST scale, using a 20-item, 7-point Likert scale, measured trust in social media advertising. The ADTRUST scale has been shown to be a valid measure of trust in advertising within traditional media, such as broadcast media and print media, and, more recently, the Internet (as a broader platform).
The adapted scale was validated through exploratory factor analysis (EFA), resulting in a three-factor solution. These three factors were named ‘reliability’, ‘usefulness and affect’, and ‘willingness to rely on’. Factor scores (weighted measures) were then calculated for these factors. Factor scores are estimates of the scores survey participants would have received on each of the factors had they been measured directly, with the following results recorded (Reliability = 4.68/7; Usefulness and Affect = 4.53/7; and Willingness to Rely On = 3.94/7). Further statistical analysis (independent samples t-test) determined the difference in factor scores for each factor when age (Gen Z/Millennials vs. Gen X/Boomers) was utilized as the independent, categorical variable. The results showed the difference in mean scores across all three factors to be statistically significant (p<0.05) for these two core age groupings: (1) Gen Z/Millennials Reliability = 4.90/7 vs. Gen X/Boomers Reliability = 4.34/7; (2) Gen Z/Millennials Usefulness and Affect = 4.85/7 vs. Gen X/Boomers Usefulness and Affect = 4.05/7; and (3) Gen Z/Millennials Willingness to Rely On = 4.53/7 vs. Gen X/Boomers Willingness to Rely On = 3.03/7. The results clearly indicate that older social media users lack trust in the quality of information conveyed in social media ads when compared to younger, more social media-savvy consumers. This is especially evident with respect to Factor 3 (Willingness to Rely On), whose underlying variables reflect one’s behavioral intent to act based on the information conveyed in advertising.
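The group comparison reported above (an independent-samples t-test on factor scores) can be sketched as follows. This is a minimal illustration using Welch's unequal-variance formula; the sample values below are invented for illustration and are not the study's data.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample (n-1) variances
    se = sqrt(va / na + vb / nb)
    t = (mean(sample_a) - mean(sample_b)) / se
    # Welch-Satterthwaite approximation of degrees of freedom
    df = (va / na + vb / nb) ** 2 / (
        (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1)
    )
    return t, df

# Invented factor scores on a 1-7 scale for two age groups
younger = [4.9, 4.5, 5.1, 4.2, 4.8, 5.0]
older = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]
t, df = welch_t(younger, older)
```

The t statistic would then be compared against the t distribution with `df` degrees of freedom to obtain the p-value.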
These findings can be useful to marketers, advertisers, and brand managers in that the results highlight a critical need to design ‘authentic’ advertisements on social media sites to better connect with these older users in an attempt to foster positive behavioral responses from within this large demographic group – whose engagement with social media sites continues to increase year on year.
Keywords: social media advertising, trust, older consumers, internet studies
Procedia PDF Downloads 40
14 Bringing Together Student Collaboration and Research Opportunities to Promote Scientific Understanding and Outreach Through a Seismological Community
Authors: Michael Ray Brunt
Abstract:
China has been the site of some of the most significant earthquakes in history; however, earthquake monitoring has long been the provenance of universities and research institutions. The China Digital Seismographic Network was initiated in 1983 and improved significantly during 1992-1993. Data from the CDSN is widely used by government and research institutions, but, generally, this data is not readily accessible to middle and high school students. An educational seismic network in China is needed to provide collaboration and research opportunities for students, engaging students around the country in scientific understanding of earthquake hazards and risks while promoting community awareness. In 2022, the Tsinghua International School (THIS) Seismology Team, made up of enthusiastic students and facilitated by two experienced teachers, was established. As a group, the team’s objective is to install seismographs in schools throughout China, thus creating an educational seismic network that shares data from the THIS Educational Seismic Network (THIS-ESN) and facilitates collaboration. The THIS-ESN initiative will enhance education and outreach in China about earthquake risks and hazards, introduce seismology to a wider audience, stimulate interest in research among students, and develop students’ programming, data collection, and analysis skills. It will also encourage and inspire young minds to pursue science, technology, engineering, the arts, and math (STEAM) career fields. The THIS-ESN utilizes small, low-cost RaspberryShake seismographs as a powerful tool linked into a global network, giving schools and the public access to real-time seismic data from across China, increasing earthquake monitoring capabilities in the respective areas, and adding to the available data sets regionally and worldwide, helping create a denser seismic network.
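As an illustration of the kind of data-analysis skill such a network is meant to develop, a classic STA/LTA (short-term average over long-term average) trigger, a standard method for flagging earthquake arrivals in a seismic trace, can be sketched in a few lines. The trace below is synthetic; a real exercise would use RaspberryShake station output instead.

```python
def sta_lta(signal, short_win, long_win):
    """Ratio of short-term to long-term average absolute amplitude;
    a sudden jump in the ratio suggests a seismic arrival."""
    ratios = []
    for i in range(long_win, len(signal)):
        sta = sum(abs(x) for x in signal[i - short_win:i]) / short_win
        lta = sum(abs(x) for x in signal[i - long_win:i]) / long_win
        ratios.append(sta / lta if lta else 0.0)
    return ratios

# Synthetic trace: low-amplitude noise, then a burst mimicking a P-wave arrival
trace = [0.1, -0.1] * 50 + [2.0, -2.0] * 10
ratios = sta_lta(trace, short_win=5, long_win=40)
triggered = max(ratios) > 3.0  # a common style of detection threshold
```

During the quiet portion the ratio stays near 1.0; the burst pushes it well past the threshold, which is how students can locate candidate events in their own recordings.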
The RaspberryShake seismograph is compatible with free seismic data viewing platforms such as SWARM, and RaspberryShake web programs and mobile apps are designed specifically for teaching seismology and seismic data interpretation, providing opportunities to enhance understanding. The RaspberryShake is powered by an operating system embedded in the Raspberry Pi, which makes it an easy platform for teaching students basic computer communication concepts by utilizing processing tools to investigate, plot, and manipulate data. The THIS Seismology Team believes strongly in creating opportunities for committed students to become part of the seismological community by engaging in analysis of real-time scientific data with tangible outcomes. Students will feel proud of the important work they are doing to understand the world around them and become advocates spreading their knowledge back into their homes and communities, helping to improve overall community resilience. We trust that, in studying the results seismograph stations yield, students will not only grasp how subjects like physics and computer science apply in real life but also, by spreading information, help students across the country appreciate how and why earthquakes bear on their lives, develop practical skills in STEAM, and engage in the global seismic monitoring effort. By providing such an opportunity to schools across the country, we are confident that we will be an agent of change for society.
Keywords: collaboration, outreach, education, seismology, earthquakes, public awareness, research opportunities
Procedia PDF Downloads 72
13 CLOUD Japan: Prospective Multi-Hospital Study to Determine the Population-Based Incidence of Hospitalized Clostridium difficile Infections
Authors: Kazuhiro Tateda, Elisa Gonzalez, Shuhei Ito, Kirstin Heinrich, Kevin Sweetland, Pingping Zhang, Catia Ferreira, Michael Pride, Jennifer Moisi, Sharon Gray, Bennett Lee, Fred Angulo
Abstract:
Clostridium difficile (C. difficile) is the most common cause of antibiotic-associated diarrhea and infectious diarrhea in healthcare settings. Japan has an aging population; the elderly are at increased risk of hospitalization, antibiotic use, and C. difficile infection (CDI). Little is known about the population-based incidence and disease burden of CDI in Japan, although limited hospital-based studies have reported a lower incidence than in the United States. To understand CDI disease burden in Japan, CLOUD (Clostridium difficile Infection Burden of Disease in Adults in Japan) was developed. CLOUD will derive population-based incidence estimates of the number of CDI cases per 100,000 population per year in Ota-ku (population 723,341), one of the districts in Tokyo, Japan. CLOUD will include approximately 14 of the 28 Ota-ku hospitals, including Toho University Hospital, which is a 1,000-bed tertiary care teaching hospital. During the 12-month patient enrollment period, which is scheduled to begin in November 2018, Ota-ku residents > 50 years of age who are hospitalized at a participating hospital with diarrhea (> 3 unformed stools (Bristol Stool Chart 5-7) in 24 hours) will be actively ascertained, consented, and enrolled by study surveillance staff. A stool specimen will be collected from enrolled patients and tested at a local reference laboratory (LSI Medience, Tokyo) using QUIK CHEK COMPLETE® (Abbott Laboratories), which simultaneously tests specimens for the presence of glutamate dehydrogenase (GDH) and C. difficile toxins A and B. A frozen stool specimen will also be sent to the Pfizer Laboratory (Pearl River, United States) for analysis using a two-step diagnostic testing algorithm that is based on detection of C. difficile strains/spores harboring the toxin B gene by PCR, followed by detection of free toxins (A and B) using a proprietary cell cytotoxicity neutralization assay (CCNA) developed by Pfizer. Positive specimens will be anaerobically cultured, and C. difficile isolates will be characterized by ribotyping and whole genomic sequencing. CDI patients enrolled in CLOUD will be contacted weekly for 90 days following diarrhea onset to describe clinical outcomes including recurrence, reinfection, and mortality, and patient-reported economic, clinical, and humanistic outcomes (e.g., health-related quality of life, worsening of comorbidities, and patient and caregiver work absenteeism). Studies will also be undertaken to fully characterize the catchment area to enable population-based estimates. The 12-month active ascertainment of CDI cases among hospitalized Ota-ku residents with diarrhea in CLOUD, and the characterization of the Ota-ku catchment area, including estimation of the proportion of all hospitalizations of Ota-ku residents that occur in the CLOUD-participating hospitals, will yield CDI population-based incidence estimates, which can be stratified by age group, risk group, and source (hospital-acquired or community-acquired). These incidence estimates will be extrapolated, following age standardization using national census data, to yield CDI disease burden estimates for Japan. CLOUD also serves as a model for studies in other countries that can use the CLOUD protocol to estimate CDI disease burden.
Keywords: Clostridium difficile, disease burden, epidemiology, study protocol
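In essence, the two-step testing algorithm described above reduces to a short decision rule: a PCR screen for the toxin B gene gates entry to the confirmatory free-toxin assay. The sketch below illustrates that gating; the return labels are hypothetical stand-ins, not the study's case definitions.

```python
def classify_specimen(pcr_toxin_b_positive: bool, ccna_free_toxin_positive: bool) -> str:
    """Two-step CDI testing: PCR first, then the cell cytotoxicity
    neutralization assay (CCNA) only for PCR-positive specimens."""
    if not pcr_toxin_b_positive:
        # No toxigenic strain detected; the CCNA step is not performed
        return "negative"
    if ccna_free_toxin_positive:
        return "toxin-positive CDI"
    # Strain carries the toxin B gene but free toxin was not detected
    return "toxigenic strain without free toxin"
```

Separating gene detection from free-toxin detection is what lets the study distinguish colonization by a toxigenic strain from active, toxin-mediated disease.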
Procedia PDF Downloads 261
12 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts
Authors: William Michael Short
Abstract:
‘Semantic Web’ technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized ‘semantic bioinformatics’ have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues, from the perspective of cognitive linguistics and cognitive anthropology, that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called ‘discourse topics’). Second, natural language processing systems tend to operate according to the principle of ‘one token, one tag’. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun or a verb or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable.
But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture ‘expert’ technical models rather than ‘folk’ models of knowledge and so may not match users’ common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen, the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to help capture variations within technical vocabularies – rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis – they fail to capture this dimension of linguistic usage. The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.
Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics
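The ‘one token, one tag’ principle criticized above can be made concrete with a toy tagger: a deterministic tagger must commit to a single part of speech for an ambiguous word like sound, while an ambiguity-preserving tagger returns the full candidate set for later contextual resolution. The tiny lexicon here is invented purely for illustration.

```python
# Toy lexicon mapping tokens to their possible part-of-speech tags
LEXICON = {
    "sound": {"NOUN", "VERB", "ADJ"},  # 'a sound', 'to sound', 'sound advice'
    "the": {"DET"},
    "advice": {"NOUN"},
}

def tag_deterministic(token):
    """'One token, one tag': force a single choice, discarding alternatives."""
    return sorted(LEXICON.get(token, {"UNK"}))[0]

def tag_preserving(token):
    """Keep every candidate tag so later context can disambiguate."""
    return LEXICON.get(token, {"UNK"})
```

The deterministic tagger silently discards two of sound's three readings; the preserving variant keeps the ambiguity explicit, which is the behavior the paper argues semantic technologies need.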
Procedia PDF Downloads 132
11 Increasing Student Engagement through Culturally-Responsive Classroom Management
Authors: Catherine P. Bradshaw, Elise T. Pas, Katrina J. Debnam, Jessika H. Bottiani, Michael Rosenberg
Abstract:
Worldwide, ethnically and culturally diverse students are at increased risk for school failure, discipline problems, and dropout. Despite decades of concern about this issue of disparities in education and other fields (e.g., the 'school to prison pipeline'), there has been limited empirical examination of models that can actually reduce these gaps in schools. Moreover, few studies have examined the effectiveness of in-service teacher interventions and supports specifically designed to reduce discipline disparities and improve student engagement. This session provides an overview of the evidence-based Double Check model, which serves as a framework for teachers to use culturally-responsive strategies to engage ethnically and culturally diverse students in the classroom and reduce discipline problems. Specifically, Double Check is a school-based prevention program which includes three core components: (a) enhancements to the school-wide Positive Behavioral Interventions and Supports (PBIS) tier-1 level of support; (b) five one-hour professional development training sessions, each of which addresses five domains of cultural competence (i.e., connection to the curriculum, authentic relationships, reflective thinking, effective communication, and sensitivity to students’ culture); and (c) coaching of classroom teachers using an adapted version of the Classroom Check-Up, which intends to increase teachers’ use of effective classroom management and culturally-responsive strategies using research-based motivational interviewing and data-informed problem-solving approaches. This paper presents findings from a randomized controlled trial (RCT) testing the impact of Double Check on office discipline referrals (disaggregated by race) and independently observed and self-reported culturally-responsive practices and classroom behavior management.
The RCT included 12 elementary and middle schools; 159 classroom teachers were randomized either to receive coaching or to serve as comparisons. Specifically, multilevel analyses indicated that teacher self-reported culturally responsive behavior management improved over the course of the school year for teachers who received the coaching and professional development. Moreover, the average annual office discipline referrals issued to Black students were reduced among teachers who were randomly assigned to receive coaching relative to comparison teachers. Similarly, observations conducted by trained external raters indicated significantly more teacher proactive behavior management and anticipation of student problems, higher student compliance, less student non-compliance, and less socially disruptive behavior in classrooms led by coached teachers than in classrooms led by teachers randomly assigned to the non-coached condition. These findings indicated promising effects of the Double Check model on a range of teacher and student outcomes, including disproportionality in office discipline referrals among Black students. These results also suggest that the Double Check model is one of only a few systematic approaches to promoting culturally-responsive behavior management that has been rigorously tested and shown to be associated with improvements in student and staff outcomes, including significant reductions in discipline problems and improvements in behavior management. Implications of these findings are considered within the broader context of globalization and demographic shifts, and their impacts on schools. These issues are particularly timely, given growing concerns about immigration policies in the U.S. and abroad.
Keywords: ethnically and culturally diverse students, student engagement, school-based prevention, academic achievement
Procedia PDF Downloads 282
10 Surface Functionalization Strategies for the Design of Thermoplastic Microfluidic Devices for New Analytical Diagnostics
Authors: Camille Perréard, Yoann Ladner, Fanny D'Orlyé, Stéphanie Descroix, Vélan Taniga, Anne Varenne, Cédric Guyon, Michael Tatoulian, Frédéric Kanoufi, Cyrine Slim, Sophie Griveau, Fethi Bedioui
Abstract:
The development of micro total analysis systems is of major interest for contaminant and biomarker analysis. As a lab-on-chip integrates all steps of an analysis procedure in a single device, analysis can be performed in an automated format with reduced time and cost, while maintaining performances comparable to those of conventional chromatographic systems. Moreover, these miniaturized systems are compatible with either field work or glovebox manipulations. This work is aimed at developing an analytical microsystem for trace and ultra-trace quantitation in complex matrices. The strategy consists in the integration of a sample pretreatment step within the lab-on-chip by a confinement zone where selective ligands are immobilized for target extraction and preconcentration. Aptamers were chosen as selective ligands because of their high affinity for all types of targets (from small ions to viruses and cells) and their ease of synthesis and functionalization. This integrated target extraction and concentration step will be followed in the microdevice by an electrokinetic separation step and on-line detection. Polymers consisting of cyclic olefin copolymer (COC) or fluoropolymer (Dyneon THV) were selected as they are easy to mold, transparent in the UV-visible range, and highly resistant to solvents and extreme pH conditions. However, because of their low chemical reactivity, surface treatments are necessary. For the design of this miniaturized diagnostics device, we aimed at modifying the microfluidic system at two scales: (1) on the entire surface of the microsystem, to control the surface hydrophobicity (so as to avoid any sample wall adsorption) and the fluid flows during electrokinetic separation, or (2) locally, so as to immobilize selective ligands (aptamers) on restricted areas for target extraction and preconcentration.
We developed different novel strategies for the surface functionalization of COC and Dyneon, based on plasma, chemical, and/or electrochemical approaches. In a first approach, a plasma-induced immobilization of brominated derivatives was performed on the entire surface. Further substitution of the bromine by an azide functional group led to covalent immobilization of ligands through a “click” chemistry reaction between azides and terminal alkynes. COC and Dyneon materials were characterized at each step of the surface functionalization procedure by various complementary techniques (contact angle, XPS, ATR) to evaluate the quality and homogeneity of the functionalization. With the objective of local (micrometric-scale) aptamer immobilization, we developed an original electrochemical strategy on an engraved Dyneon THV microchannel. Through local electrochemical carbonization followed by adsorption of azide-bearing diazonium moieties and covalent linkage of alkyne-bearing aptamers through the click chemistry reaction, typical dimensions of immobilization zones reached the 50 µm range. Other functionalization strategies, such as sol-gel encapsulation of aptamers, are currently investigated and may also be suitable for the development of the analytical microdevice. The development of these functionalization strategies is the first crucial step in the design of the entire microdevice. These strategies allow the grafting of a large number of molecules for the development of new analytical tools in various domains like environment or healthcare.
Keywords: alkyne-azide click chemistry (CuAAC), electrochemical modification, microsystem, plasma bromination, surface functionalization, thermoplastic polymers
Procedia PDF Downloads 442
9 Removing Maturational Influences from Female Youth Swimming: The Application of Corrective Adjustment Procedures
Authors: Clorinda Hogan, Shaun Abbott, Mark Halaki, Marcela Torres Catiglioni, Goshi Yamauchi, Lachlan Mitchell, James Salter, Michael Romann, Stephen Cobley
Abstract:
Introduction: Common annual age-group competition structures unintentionally introduce participation inequalities, performance (dis)advantages, and selection biases due to the effect of maturational variation between youth swimmers. On this basis, there are implications for improving performance evaluation strategies. Therefore, the aims were to: (1) determine maturity timing distributions in female youth swimming; (2) quantify the relationship between maturation status and 100-m front-crawl (FC) performance; and (3) apply Maturational-based Corrective Adjustment Procedures (Mat-CAPs) for removal of maturational status influences on performance. Methods: (1) Cross-sectional analysis of 663 female (10-15 years) swimmers who underwent assessment of anthropometrics (mass, height, and sitting height) and estimations of maturity timing and offset. (2) 100-m FC performance (seconds) was assessed at Australian regional, state, and national-level competitions between 2016-2020. To determine the relationship between maturation status and 100-m FC performance, maturity offset (MO) was plotted against 100-m FC performance time. The expected maturity status–performance relationship for females aged 10-15 years was obtained through a quadratic function (y = ax² + bx + c) from unstandardized coefficients. The regression equation was subsequently used for Mat-CAPs. (3) Participants aged 10-13 years were categorised into maturity-offset categories. Maturity offset distributions for Raw (‘All’, ‘Top 50%’ and ‘Top 25%’) and Correctively Adjusted swim times were examined. Chi-square tests, Cramér’s V, and odds ratios (ORs) determined the occurrence of maturation biases for each age group and selection level. Results: (1) Maturity timing distributions illustrated an overrepresentation of ‘normative’ maturing swimmers (11.82 ± 0.40 years), with a descriptive shift toward early maturing relative to the normative population.
(2) A curvilinear relationship between maturity offset and swim performance was identified (R² = 0.53, P < 0.001) and subsequently utilised for Mat-CAPs. (3) Raw maturity offset categories identified partial maturation status skewing towards biologically older swimmers at 10/11 and 12 years, with effect magnitudes increasing in the ‘Top 50%’ and ‘Top 25%’ of performance times. Following Mat-CAPs application, maturity offset biases were removed in similar age groups and selection levels. When adjusting performance times for maturity offset, Mat-CAPs was successful in mitigating maturational biases until approximately 1 year post peak height velocity. The overrepresentation of ‘normative’ maturing female swimmers contrasted with the substantial overrepresentation of ‘early’ maturing male swimmers found previously in 100-m front-crawl. These findings suggest early maturational timing is not advantageous in females, but findings associated with Aim 2 highlight how advanced maturational status remained beneficial to performance. Observed differences between female and male maturational biases may relate to the differential impact of physiological development during the pubertal years. Females experience greater increases in fat mass and potentially differing changes in body shape, which can negatively affect swim performance. Conclusions: Transient maturation status-based participation and performance advantages were apparent within a large sample of Australian female youth 100-m FC swimmers. By removing maturity status performance biases within female youth swimming, Mat-CAPs could help improve participation experiences and the accuracy of identifying genuinely skilled female youth swimmers.
Keywords: athlete development, long-term sport participation, performance evaluation, talent identification, youth competition
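The corrective adjustment at the heart of Mat-CAPs amounts to removing the maturity-related component predicted by the quadratic fit: a swimmer's raw time is shifted by the difference between the fit's prediction at her maturity offset and its prediction at a reference offset. A minimal sketch, using invented coefficients since the abstract does not reproduce the fitted values:

```python
def predicted_time(mo, a, b, c):
    """Expected 100-m time (s) at maturity offset mo from the fit y = a*mo**2 + b*mo + c."""
    return a * mo ** 2 + b * mo + c

def mat_cap_adjust(raw_time, mo, a, b, c, reference_mo=0.0):
    """Remove the maturity-status component: shift the raw time by the
    predicted difference between the swimmer's offset and the reference
    offset (here, peak height velocity, MO = 0)."""
    return raw_time - (predicted_time(mo, a, b, c) - predicted_time(reference_mo, a, b, c))

# Invented coefficients: b < 0 makes biologically older swimmers (higher MO) faster
a, b, c = 0.35, -3.2, 68.0
adjusted = mat_cap_adjust(63.0, mo=1.5, a=a, b=b, c=c)
```

With these coefficients a swimmer 1.5 years past peak height velocity has her raw 63.0 s adjusted upward, so that adjusted times can be compared across swimmers of differing maturity status.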
Procedia PDF Downloads 183
8 Low-Cost Aviation Solutions to Strengthen Counter-Poaching Efforts in Kenya
Authors: Kuldeep Rawat, Michael O'Shea, Maureen McGough
Abstract:
The paper will discuss a National Institute of Justice (NIJ)-funded project to provide cost-effective aviation technologies and research to support counter-poaching operations related to endangered, protected, and/or regulated wildlife. The goal of this project is to provide cost-effective aviation technology and research support to the Kenya Wildlife Service (KWS) in its counter-poaching efforts. In pursuit of this goal, Elizabeth City State University (ECSU) is assisting the National Institute of Justice (NIJ) in enhancing the Kenya Wildlife Service’s aviation technology and related capacity to meet its counter-poaching mission. Poaching, at its core, is systemic, as poachers go to the most extreme lengths to kill high-target species such as elephant and rhino. These high-target wildlife species live in underdeveloped or impoverished nations, where poachers find fewer barriers to their operations. In Kenya, with fifty-nine (59) parks and reserves spread over an area of 225,830 square miles (584,897 square kilometers), adequate surveillance on the ground is next to impossible. Cost-effective aviation surveillance technologies, based on a comprehensive needs assessment and operational evaluation, are needed to curb poaching and effectively prevent wildlife trafficking. As one of the premier law enforcement Air Wings in East Africa, KWS plays a crucial role in Kenya, not only in counter-poaching and wildlife conservation efforts but also in aerial surveillance, counterterrorism, and national security efforts. While the Air Wing has done a remarkable job conducting aerial patrols with limited resources, additional aircraft and upgraded technology should significantly advance the Air Wing’s ability to achieve its wildlife protection mission.
The project includes: (i) a Needs Assessment of the KWS Air Wing, including the identification of resources, current and prospective capacity, operational challenges, and priority goals for expansion; (ii) Acquisition of Low-Cost Aviation Technology to meet priority needs; and (iii) an Operational Evaluation of technology performance, with a focus on implementation and effectiveness. The Needs Assessment reflects the priorities identified through two site visits to the KWS Air Wing in Nairobi, Kenya, as well as field visits to multiple national parks receiving aerial support and interviews/surveys with KWS Air Wing pilots and leadership. The Needs Assessment identified several immediate technology needs: GPS upgrades, including a weather application; night-flying capabilities, including runway lights and night-vision technology; cameras and surveillance equipment; a flight tracking system and/or Emergency Position Indicating Radio Beacon; lightweight ballistic-resistant body armor; and medical equipment, including a customized stretcher and standard medical evacuation equipment. Results of this assessment, along with significant input from the KWS Air Wing, will guide the second phase of this project: technology acquisition. Acquired technology will then be evaluated in the field, with a focus on implementation and effectiveness. Results will ultimately be translated for any rural or tribal law enforcement agencies with comparable aerial surveillance missions, operational environments, and jurisdictional challenges seeking to implement low-cost aviation technology. Results from the Needs Assessment phase, including survey results, and the ongoing technology acquisition and baseline operational evaluation will be discussed in the paper.
Keywords: aerial surveillance mission, aviation technology, counter-poaching, wildlife protection
Procedia PDF Downloads 275
7 Stabilizing Additively Manufactured Superalloys at High Temperatures
Authors: Keivan Davami, Michael Munther, Lloyd Hackel
Abstract:
The control of properties and material behavior through thermal-mechanical processing is based on mechanical deformation and annealing according to a precise schedule that produces a unique and stable combination of grain structure, dislocation substructure, texture, and dispersion of precipitated phases. The authors recently developed a thermal-mechanical technique to stabilize the microstructure of additively manufactured nickel-based superalloys even after exposure to high temperatures. However, the mechanism(s) controlling this stability is still under investigation. Laser peening (LP), also called laser shock peening (LSP), is a shock-based (50 ns pulse duration) post-processing technique used for extending performance levels and improving the service life of critical components by developing deep levels of plastic deformation, thereby generating a high density of dislocations and inducing compressive residual stresses in the surface and deep subsurface of components. These compressive residual stresses are usually accompanied by an increase in hardness and enhance the material’s resistance to surface-related failures such as creep, fatigue, contact damage, and stress corrosion cracking. While the LP process enhances the life span and durability of the material, the induced compressive residual stresses relax at high temperatures (>0.5Tm, where Tm is the absolute melting temperature), limiting the applicability of the technology. At temperatures above 0.5Tm, the compressive residual stresses relax, and yield strength begins to drop dramatically. The principal reason is the increasing rate of solid-state diffusion, which affects both the dislocations and the microstructural barriers. Dislocation configurations commonly recover rapidly at high temperatures by mechanisms such as climb and recombination.
Furthermore, precipitates coarsen and grains grow; virtually all of the available microstructural barriers become ineffective. Our results indicate that by using “cyclic” treatments with sequential LP and annealing steps, the compressive stresses survive, and the microstructure remains stable after exposure to temperatures exceeding 0.5Tm for long periods of time. When the laser peening process is combined with annealing, dislocations formed as a result of LP and precipitates formed during annealing have a complex interaction that provides further stability at high temperatures. From a scientific point of view, this research lays the groundwork for studying a variety of physical, materials science, and mechanical engineering concepts. This research could lead to metals operating at higher sustained temperatures, enabling improved system efficiencies. The strengthening of metals by a variety of means (alloying, work hardening, and other processes) has been of interest for a wide range of applications. However, the mechanistic understanding of the often complex interactions between dislocations and solute atoms and precipitates during plastic deformation has largely remained scattered in the literature. In this research, the actual mechanisms involved in the novel cyclic LP/annealing processes are elucidated through parallel studies of dislocation theory and the implementation of advanced experimental tools. The results of this research help validate a novel laser processing technique for high-temperature applications. This will greatly expand the applications of laser peening technology, originally devised only for temperatures lower than half of the melting temperature.
Keywords: laser shock peening, mechanical properties, indentation, high temperature stability
Procedia PDF Downloads 149
6 Older Consumer’s Willingness to Trust Social Media Advertising: An Australian Case
Authors: Simon J. Wilde, David M. Herold, Michael J. Bryant
Abstract:
Social media networks have become the hotbed for advertising activities, due mainly to their increasing consumer/user base and to the ability of marketers to accurately measure ad exposure and consumer-based insights on such networks. More than half of the world’s population (4.8 billion, or 60%) now uses social media, with 150 million new users having come online within the last 12 months (to June 2022). As the use of social media networks by users grows, key business strategies used for interacting with these potential customers have matured, especially social media advertising. Unlike other traditional media outlets, social media advertising is highly interactive and digital channel-specific. Social media advertisements are clearly targetable, providing marketers with an extremely powerful marketing tool. Yet despite the measurable benefits afforded to businesses engaged in social media advertising, recent controversies (such as the relationship between Facebook and Cambridge Analytica in 2018) have only heightened the role trust and privacy play within these social media networks. The purpose of this exploratory paper is to investigate the extent to which social media users trust social media advertising. Understanding this relationship will fundamentally assist marketers in better understanding social media interactions and their implications for society. Using a web-based quantitative survey instrument, survey participants were recruited via a reputable online panel survey site. Respondents to the survey represented social media users from all states and territories within Australia. Completed responses were received from a total of 258 social media users. Survey respondents represented all core age demographic groupings, including Gen Z/Millennials (18-45 years = 60.5% of respondents) and Gen X/Boomers (46-66+ years = 39.5% of respondents).
An adapted ADTRUST scale, using a 20-item 7-point Likert scale, measured trust in social media advertising. The ADTRUST scale has been shown to be a valid measure of trust in advertising across different traditional media, such as broadcast media and print media, and more recently, the Internet (as a broader platform). The adapted scale was validated through exploratory factor analysis (EFA), resulting in a three-factor solution. These three factors were named reliability; usefulness and affect; and willingness to rely on. Factor scores (weighted measures) were then calculated for these factors. Factor scores are estimates of the scores survey participants would have received on each of the factors had they been measured directly, with the following results recorded (Reliability = 4.68/7; Usefulness and Affect = 4.53/7; and Willingness to Rely On = 3.94/7). Further statistical analysis (independent samples t-test) determined the difference in factor scores when age (Gen Z/Millennials vs. Gen X/Boomers) was utilised as the independent, categorical variable. The results showed the difference in mean scores across all three factors to be statistically significant (p<0.05) for these two core age groupings: Gen Z/Millennials Reliability = 4.90/7 vs. Gen X/Boomers Reliability = 4.34/7; Gen Z/Millennials Usefulness and Affect = 4.85/7 vs. Gen X/Boomers Usefulness and Affect = 4.05/7; and Gen Z/Millennials Willingness to Rely On = 4.53/7 vs. Gen X/Boomers Willingness to Rely On = 3.03/7. The results clearly indicate that older social media users lack trust in the quality of information conveyed in social media ads, when compared to younger, more social media-savvy consumers. This is especially evident with respect to Factor 3 (Willingness to Rely On), whose underlying variables reflect one’s behavioural intent to act based on the information conveyed in advertising.
These findings can be useful to marketers, advertisers, and brand managers, as they highlight a critical need to design ‘authentic’ advertisements on social media sites to better connect with these older users, in an attempt to foster positive behavioural responses from within this large demographic group, whose engagement with social media sites continues to increase year on year.
Keywords: social media advertising, trust, older consumers, online
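The group comparison reported above (an independent-samples t-test on factor scores by age cohort) can be sketched as follows. The data here are simulated around the reported ‘Willingness to Rely On’ group means; the within-group standard deviation of 1.2 and the use of Welch's unequal-variance statistic are assumptions, since the abstract does not report them.

```python
import numpy as np

def welch_t(a, b):
    """Welch's independent-samples t statistic (unequal variances)."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

rng = np.random.default_rng(1)
# Hypothetical 1-7 factor scores centred on the reported group means
# for 'Willingness to Rely On' (Gen Z/Millennials 4.53 vs Gen X/Boomers 3.03)
gen_z = rng.normal(4.53, 1.2, 156)  # ~60.5% of the 258 respondents
gen_x = rng.normal(3.03, 1.2, 102)  # ~39.5% of the 258 respondents

t = welch_t(gen_z, gen_x)
# |t| > ~1.98 (two-tailed, large df) corresponds to p < 0.05
significant = abs(t) > 1.98
```

With group means 1.5 scale points apart and samples of this size, the test is comfortably significant, mirroring the p < 0.05 result reported for all three factors.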
Procedia PDF Downloads 81
5 Investigation on Pull-Out-Behavior and Interface Critical Parameters of Polymeric Fibers Embedded in Concrete and Their Correlation with Particular Fiber Characteristics
Authors: Michael Sigruener, Dirk Muscat, Nicole Struebbe
Abstract:
Fiber reinforcement is a well-established means of enhancing the mechanical properties of plastics. In concrete and civil engineering, steel reinforcements are commonly used. Steel reinforcements show disadvantages in their chemical resistance and weight, whereas polymer fibers’ major problems lie in fiber-matrix adhesion and mechanical properties. Despite these facts, longevity, easy handling, and chemical resistance motivate researchers to develop polymeric materials for fiber-reinforced concrete. Adhesion and interfacial mechanisms in fiber-polymer composites have already been studied thoroughly. For polymer fibers used as concrete reinforcement, the bonding behavior still requires deeper investigation. Therefore, several different polymers (e.g., polypropylene (PP), polyamide 6 (PA6), and polyetheretherketone (PEEK)) were spun into fibers via single-screw extrusion and monoaxial stretching. Fibers were then embedded in a concrete matrix, and Single-Fiber Pull-Out Tests (SFPT) were conducted to investigate the bonding characteristics and microstructural interface of the composite. Differences in maximum pull-out force, displacement, and the slope of the linear part of the force-displacement curve, which together reflect the adhesion strength and the ductility of the interfacial bond, were studied. In SFPT, fiber debonding is an inhomogeneous process in which interfacial bonding and friction mechanisms combine into the measured result. Therefore, correlations between polymer properties and pull-out mechanisms have to be emphasized. To investigate these correlations, all fibers were subjected to a series of analyses, including differential scanning calorimetry (DSC), contact angle measurement, surface roughness and hardness analysis, tensile testing, and scanning electron microscopy (SEM).
Of each polymer, smooth and abraded fibers were tested, first to simulate the abrasion and damage caused by a concrete mixing process and secondly to estimate the influence of the mechanical anchoring of rough surfaces. In general, abraded fibers showed a significant increase in maximum pull-out force due to better mechanical anchoring. Friction processes therefore play a major role in increasing the maximum pull-out force. Polymer hardness affects the tribological behavior, and polymers with high hardness exhibit lower surface roughness, as verified by SEM and surface roughness measurements. This results in a decreased maximum pull-out force for hard polymers. High-surface-energy polymers generally show better interfacial bonding strength, consistent with the SFPT results. Polymers such as PEEK or PA6 show higher bonding strength for both smooth and roughened fibers, revealed through high pull-out force and concrete particles bonded to the fiber surface, as pictured via SEM analysis. Surface energy divides into a dispersive and a polar part, with the slope correlating with the polar part. Only polar polymers increase their SFPT slope, owing to better wetting ability, when rough surfaces provide a larger bonding area. Hence, the maximum force and the bonding strength of an embedded fiber are a function of polarity, hardness, and consequently surface roughness. Other properties such as crystallinity or tensile strength do not affect the bonding behavior. Through the conducted analyses, it is now feasible to understand and resolve the different effects in pull-out behavior step by step based on the polymer properties themselves. This investigation developed a roadmap for engineering highly adhering polymeric materials for the fiber reinforcement of concrete.
Keywords: fiber-matrix interface, polymeric fibers, fiber reinforced concrete, single fiber pull-out test
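The SFPT metrics discussed above, maximum pull-out force and the slope of the linear part of the force-displacement curve, can be extracted programmatically. Below is a minimal sketch on a synthetic pull-out curve; the curve shape, the units, and the choice of the first 40% of points up to the peak as the ‘linear part’ are illustrative assumptions, not the authors' exact evaluation protocol.

```python
import numpy as np

def sfpt_metrics(displacement, force, linear_frac=0.4):
    """Extract the maximum pull-out force and the slope of the initial
    (approximately linear) part of the force-displacement curve."""
    f_max = float(np.max(force))
    peak = int(np.argmax(force))
    n_lin = max(2, int(peak * linear_frac))  # first 40% of points up to the peak
    slope = float(np.polyfit(displacement[:n_lin], force[:n_lin], 1)[0])
    return f_max, slope

# Synthetic pull-out curve: linear rise, debonding peak, frictional decay
d = np.linspace(0, 5, 200)                               # displacement, mm
f = np.where(d < 1.0, 120 * d, 120 * np.exp(-(d - 1.0)))  # force, N
f_max, slope = sfpt_metrics(d, f)
```

The slope of the linear region reflects the stiffness of the interfacial bond, while the peak force captures the combined debonding and friction contributions the abstract describes.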
Procedia PDF Downloads 113
4 Radioprotective Effects of Super-Paramagnetic Iron Oxide Nanoparticles Used as Magnetic Resonance Imaging Contrast Agent for Magnetic Resonance Imaging-Guided Radiotherapy
Authors: Michael R. Shurin, Galina Shurin, Vladimir A. Kirichenko
Abstract:
Background. Hepatic malignancies show poor visibility on non-contrast imaging during daily verification prior to radiation therapy on MRI-guided Linear Accelerators (MR-Linac). Ferumoxytol® (Feraheme, AMAG Pharmaceuticals, Waltham, MA) is a superparamagnetic iron oxide nanoparticle (SPION) agent that is increasingly utilized off-label as a hepatic MRI contrast agent. This agent has the advantage of providing a functional assessment of the liver based upon its uptake by hepatic Kupffer cells proportionate to vascular perfusion, resulting in strong T1, T2, and T2* relaxation effects and enhanced contrast of malignant tumors, which lack Kupffer cells. The latter characteristic has recently been utilized for MRI-guided radiotherapy planning with precision targeting of liver malignancies. However, the potential radiotoxicity of SPIONs has never been addressed for their safe use as an MRI contrast agent during liver radiotherapy on MR-Linac. This study defines the radiomodulating properties of SPIONs in vitro on human monocyte and macrophage cell lines exposed to 60Co gamma-rays within the clinical radiotherapy dose range. Methods. Human monocyte and macrophage cell lines in culture were loaded with a clinically relevant concentration of Ferumoxytol (30 µg/ml) for 2 and 24 h and irradiated with 3 Gy, 5 Gy, and 10 Gy. Cells were washed and cultured for an additional 24 and 48 h prior to assessing their phenotypic activation by flow cytometry and function, including viability (Annexin V/PI assay), proliferation (MTT assay), and cytokine expression (Luminex assay). Results. Our results revealed that SPIONs affected both human monocytes and macrophages in vitro. Specifically, iron oxide nanoparticles decreased radiation-induced apoptosis and prevented radiation-induced inhibition of human monocyte proliferative activity. Furthermore, Ferumoxytol protected monocytes from radiation-induced modulation of phenotype.
For instance, while irradiation decreased polarization of monocytes to the CD11b+CD14+ and CD11bnegCD14neg phenotypes, Ferumoxytol prevented these effects. In macrophages, Ferumoxytol counteracted the ability of radiation to up-regulate cell polarization to the CD11b+CD14+ phenotype and prevented radiation-induced down-regulation of the expression of HLA-DR and CD86 molecules. Finally, Ferumoxytol uptake by human monocytes down-regulated expression of the pro-inflammatory chemokines MIP-1α (macrophage inflammatory protein 1α, CCL3), MIP-1β (CCL4), and RANTES (CCL5). In macrophages, Ferumoxytol reversed the expression of IL-1RA, IL-8, IP-10 (CXCL10), and TNF-α, and up-regulated the expression of MCP-1 (CCL2) and MIP-1α in irradiated macrophages. Conclusion. The SPION agent Ferumoxytol increases the resistance of human monocytes to radiation-induced cell death in vitro and supports an anti-inflammatory phenotype of human macrophages under radiation. The effect is radiation dose-dependent and depends on the duration of Feraheme uptake. This study also finds strong evidence that SPIONs reversed the effect of radiation on the expression of pro-inflammatory cytokines involved in the initiation and development of radiation-induced liver damage. Correlative translational work at our institution will directly assess the cyto-protective effects of Ferumoxytol on human Kupffer cells in vitro and through ex vivo analysis of explanted liver specimens in a subset of patients receiving Feraheme-enhanced MRI-guided radiotherapy to primary liver tumors as a bridge to liver transplant.
Keywords: superparamagnetic iron oxide nanoparticles, radioprotection, magnetic resonance imaging, liver
Procedia PDF Downloads 72
3 Identification Strategies for Unknown Victims from Mass Disasters and Unknown Perpetrators from Violent Crime or Terrorist Attacks
Authors: Michael Josef Schwerer
Abstract:
Background: The identification of unknown victims from mass disasters, violent crimes, or terrorist attacks is frequently facilitated through information from missing persons lists, portrait photos, old or recent pictures showing unique characteristics of a person such as scars or tattoos, or simply reference samples from blood relatives for DNA analysis. In contrast, the identification or at least the characterization of an unknown perpetrator from criminal or terrorist actions remains challenging, particularly in the absence of material or data for comparison, such as fingerprints, which had been previously stored in criminal records. In scenarios that result in high levels of destruction of the perpetrator’s corpse, for instance, blast or fire events, the chance for a positive identification using standard techniques is further impaired. Objectives: This study shows the forensic genetic procedures in the Legal Medicine Service of the German Air Force for the identification of unknown individuals, including such cases in which reference samples are not available. Scenarios requiring such efforts predominantly involve aircraft crash investigations, which are routinely carried out by the German Air Force Centre of Aerospace Medicine as one of the Institution’s essential missions. Further, casework by military police or military intelligence is supported based on administrative cooperation. In the talk, data from study projects, as well as examples from real casework, will be demonstrated and discussed with the audience. Methods: Forensic genetic identification in our laboratories involves the analysis of Short Tandem Repeats and Single Nucleotide Polymorphisms in nuclear DNA along with mitochondrial DNA haplotyping. Extended DNA analysis involves phenotypic markers for skin, hair, and eye color together with the investigation of a person’s biogeographic ancestry. 
Assessment of the biological age of an individual employs CpG-island methylation analysis using bisulfite-converted DNA. Forensic Investigative Genealogy assessment allows the detection of an unknown person’s blood relatives in reference databases. Technically, end-point PCR, real-time PCR, capillary electrophoresis, pyrosequencing, as well as next-generation sequencing using flow-cell-based and chip-based systems are used. Results and Discussion: Optimization of DNA extraction from various sources, including difficult matrixes like formalin-fixed, paraffin-embedded tissues and degraded specimens from decomposed bodies or from decedents exposed to blast or fire events, provides the foundation for successful PCR amplification and subsequent genetic profiling. For cases with extremely low yields of extracted DNA, whole-genome preamplification protocols are successfully used, particularly regarding genetic phenotyping. Improved primer design for CpG-methylation analysis, together with validated sampling strategies for the analyzed substrates from, e.g., lymphocyte-rich organs, allows successful biological age estimation even in bodies with highly degraded tissue material. Conclusions: Successful identification of unknown individuals, or at least their phenotypic characterization using pigmentation markers together with age-informative methylation profiles, possibly supplemented by family tree searches employing Forensic Investigative Genealogy, can be provided in specialized laboratories. However, standard laboratory procedures must be adapted to work with difficult and highly degraded sample materials.
Keywords: identification, forensic genetics, phenotypic markers, CpG methylation, biological age estimation, forensic investigative genealogy
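Methylation-based age assessment of the kind described above typically regresses chronological age on beta values (the methylated fraction, 0-1) at age-informative CpG sites. The sketch below is a minimal synthetic stand-in for such a model, not any validated forensic clock; the number of CpG sites, the weights, and the purely linear form are illustrative assumptions.

```python
import numpy as np

# Hypothetical panel: beta values (0-1) at 5 age-informative CpG sites
rng = np.random.default_rng(2)
n_donors, n_cpgs = 80, 5
true_weights = np.array([40.0, -25.0, 30.0, -15.0, 20.0])  # synthetic effect sizes
betas = rng.uniform(0, 1, (n_donors, n_cpgs))
age = betas @ true_weights + 35 + rng.normal(0, 1.5, n_donors)  # training ages

# Fit weights plus intercept by least squares, a minimal stand-in for
# published epigenetic-clock regression models
X = np.column_stack([betas, np.ones(n_donors)])
weights, *_ = np.linalg.lstsq(X, age, rcond=None)

def estimate_age(sample_betas):
    """Predict biological age for one donor's CpG beta values."""
    return float(np.append(sample_betas, 1.0) @ weights)
```

In casework, the limiting factor is usually sample quality rather than the regression itself, which is why the abstract emphasises primer design and sampling strategies for degraded material.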
Procedia PDF Downloads 51
2 Bio-Electro Chemical Catalysis: Redox Interactions, Storm and Waste Water Treatment
Authors: Michael Radwan Omary
Abstract:
Context: This scientific innovation demonstrates effective desalination of surface water and groundwater using engineered organic-catalysis media. The author has developed a technology called “Storm-Water Ions Filtration Treatment” (SWIFT™), cold-reactor modules designed to retrofit typical urban street storm drains or catch basins. SWIFT™ triggers biochemical redox reactions with toxic total dissolved solids (TDS) embedded in the water stream, reducing its electrical conductivity (EC). The SWIFT™ catalyst media unlock sub-molecular bond energy, break down toxic chemical bonds, and neutralize toxic molecules, bacteria, and pathogens. Research Aim: This research aims to develop and design water desalination and disinfection systems with lower O&M costs, zero brine discharge, no energy input, and no chemicals. The objective is to provide an effective, resilient, and sustainable solution for urban storm-water and groundwater decontamination and disinfection. Methodology: We focused on the development of organic approaches to water and wastewater desalination and disinfection that require no chemicals, plugs, pumping, polymers, or allergens. SWIFT™ modules operate by directing the water stream to flow freely through the electrically charged media cold reactor, generating weak interactions with water-dissolved electrically conductive molecules and resulting in the neutralization of toxic molecules. The system is powered by harvesting the energy embedded in sub-molecular bonds. Findings: SWIFT™ case studies at CSU-CI and the CSU-Fresno Water Institute demonstrated consistently high reduction of all 40 detected wastewater pollutants, including pathogens, to levels below the State of California Department of Water Resources “Drinking Water Maximum Contaminant Levels”. The technology has proved effective in reducing pollutants such as arsenic, beryllium, mercury, selenium, glyphosate, benzene, and E. coli bacteria.
The technology has also been successfully applied to the decontamination of dissolved chemicals, water pathogens, organic compounds, and radiological agents. Theoretical Importance: SWIFT™ development, design, engineering, and manufacturing offer a cutting-edge, clean-energy bio-catalysis media solution for water and wastewater desalination and disinfection without energy input. This is a significant contribution for institutions and municipalities pursuing sustainable, lower-cost, clean-energy water desalination with zero brine and zero CO2 discharge. Data Collection and Analysis Procedures: The researchers collected data on the performance of the SWIFT™ technology in reducing the levels of various pollutants in water. The data were analyzed by comparing the reduction achieved by the SWIFT™ technology to the Drinking Water Maximum Contaminant Levels set by the State of California. The researchers also conducted live oral presentations to showcase the applications of SWIFT™ technology in storm-water capture and decontamination, as well as in providing clean drinking water during emergencies. Conclusion: The SWIFT™ technology has demonstrated its capability to effectively reduce pollutants in water and wastewater to levels below regulatory standards. The technology offers a sustainable solution for groundwater and storm-water treatment. Further development and implementation of the SWIFT™ technology have the potential to treat storm water for reuse as a new source of drinking water and as an ambient source of clean, healthy local water for groundwater recharge.
Keywords: catalysis, bio-electro interactions, water desalination, weak interactions
Procedia PDF Downloads 67
1 Location3: A Location Scouting Platform for the Support of Film and Multimedia Industries
Authors: Dimitrios Tzilopoulos, Panagiotis Symeonidis, Michael Loufakis, Dimosthenis Ioannidis, Dimitrios Tzovaras
Abstract:
The domestic film industry in Greece has traditionally relied heavily on state support. While film productions are crucial for the country’s economy, Greece has not fully capitalized on attracting and promoting foreign productions. A lack of motivation, limited organized state support for attraction and licensing, and the absence of location scouting have hindered its potential. Although recent legislative changes have addressed the first two of these issues, the development of a comprehensive location database and a search engine that would effectively support location scouting at the pre-production stage is still in its early stages. In addition to the expected benefits to the film, television, marketing, and multimedia industries, a location-scouting service platform has the potential to yield significant financial gains locally and nationally. By promoting featured places like cultural and archaeological sites, natural monuments, and visitor attractions, it plays a vital role in both cultural promotion and tourism development. This study introduces LOCATION3, an internet platform revolutionizing film production location management. It interconnects location providers, film crews, and multimedia stakeholders, offering a comprehensive environment for seamless collaboration. The platform’s central geodatabase (PostgreSQL) stores each location’s attributes, while web technologies like HTML, JavaScript, CSS, React.js, and Redux power the user-friendly interface. Advanced functionalities based on deep learning models, developed in Python, are integrated via Node.js. Visual data presentation is achieved using the JS Leaflet library, delivering an interactive map experience. LOCATION3 sets a new standard, offering a range of essential features to enhance the management of film production locations.
Firstly, it empowers users to effortlessly upload audiovisual material enriched with geospatial and temporal data, such as location coordinates, photographs, videos, 360-degree panoramas, and 3D location models. With the help of cutting-edge deep learning algorithms, the application automatically tags these materials, while users can also tag them manually. Moreover, the application allows users to record locations directly through its user-friendly mobile application. Users can then embark on seamless location searches, employing spatial or descriptive criteria. This intelligent search functionality considers a combination of relevant tags, dominant colors, architectural characteristics, emotional associations, and unique location traits. One of the application’s standout features is the ability to explore locations by their visual similarity to other materials, facilitated by a reverse image search. Also, the interactive map serves as both a dynamic display for locations and a versatile filter, adapting to the user’s preferences and effortlessly enhancing location searches. To further streamline the process, the application facilitates the creation of location lightboxes, enabling users to efficiently organize and share their content via email. Going beyond location management, the platform also provides invaluable liaison, matchmaking, and online marketplace services. This powerful functionality bridges the gap between providers of visual and three-dimensional geospatial material, local agencies, film companies, production companies, and others, so that those interested in a specific location can access additional material beyond what is stored on the platform, as well as production services supporting the functioning and completion of productions at a location (equipment provision, transportation, catering, accommodation, etc.).
Keywords: deep learning models, film industry, geospatial data management, location scouting
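The reverse-image-search feature mentioned above is commonly built by comparing embedding vectors produced by a deep network. The following is a minimal sketch of the ranking step only, assuming embeddings have already been computed; the tiny 4-dimensional vectors and location labels are purely illustrative, and a production system would typically use an approximate nearest-neighbour index rather than a brute-force scan.

```python
import numpy as np

def cosine_similarity_search(query_vec, gallery, top_k=3):
    """Rank stored location-image embeddings by cosine similarity to a query."""
    q = query_vec / np.linalg.norm(query_vec)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = g @ q                       # cosine similarity per gallery item
    order = np.argsort(-scores)[:top_k]  # best matches first
    return [int(i) for i in order], scores[order]

# Toy gallery of 4-dimensional "embeddings" (real systems would use vectors
# produced by a deep network such as a CNN or ViT image encoder)
gallery = np.array([
    [1.0, 0.0, 0.0, 0.0],   # location A
    [0.9, 0.1, 0.0, 0.0],   # location B (visually similar to A)
    [0.0, 0.0, 1.0, 0.0],   # location C
])
idx, scores = cosine_similarity_search(np.array([1.0, 0.05, 0.0, 0.0]),
                                       gallery, top_k=2)
```

The same similarity scores can also drive the platform's "explore by visual similarity" view, with the interactive map filtering the ranked results spatially.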
Procedia PDF Downloads 71