Search results for: decision tree modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8300


80 The Dynamic Nexus of Public Health and Journalism in Informed Societies

Authors: Ali Raza

Abstract:

The dynamic landscape of communication has brought about significant advancements that intersect with the realms of public health and journalism. This abstract explores the evolving synergy between these fields, highlighting how their intersection has contributed to informed societies and improved public health outcomes. In the digital age, communication plays a pivotal role in shaping public perception, policy formulation, and collective action. Public health, concerned with safeguarding and improving community well-being, relies on effective communication to disseminate information, encourage healthy behaviors, and mitigate health risks. Simultaneously, journalism, with its commitment to accurate and timely reporting, serves as the conduit through which health information reaches the masses. Advancements in communication technologies have revolutionized the ways in which public health information is both generated and shared. The advent of social media platforms, mobile applications, and online forums has democratized the dissemination of health-related news and insights. This democratization, however, brings challenges, such as the rapid spread of misinformation and the need for nuanced strategies to engage diverse audiences. Effective collaboration between public health professionals and journalists is pivotal in countering these challenges, ensuring that accurate information prevails. The synergy between public health and journalism is most evident during public health crises. The COVID-19 pandemic underscored the pivotal role of journalism in providing accurate and up-to-date information to the public. However, it also highlighted the importance of responsible reporting, as sensationalism and misinformation could exacerbate the crisis. Collaborative efforts between public health experts and journalists led to the amplification of preventive measures, the debunking of myths, and the promotion of evidence-based interventions. Moreover, the accessibility of information in the digital era necessitates a strategic approach to health communication. Behavioral economics and data analytics offer insights into human decision-making and allow tailored health messages to resonate more effectively with specific audiences. This approach, when integrated into journalism, enables the crafting of narratives that not only inform but also influence positive health behaviors. Ethical considerations emerge prominently in this alliance. The responsibility to balance the public's right to know with the potential consequences of sensational reporting underscores the significance of ethical journalism. Health journalists must meticulously source information from reputable experts and institutions to maintain credibility, thus fortifying the bridge between public health and the public. As both public health and journalism undergo transformative shifts, fostering collaboration between these domains becomes essential. Training programs that familiarize journalists with public health concepts and practices can enhance their capacity to report accurately and comprehensively on health issues. Likewise, public health professionals can gain insights into effective communication strategies from seasoned journalists, ensuring that health information reaches a wider audience. In conclusion, the convergence of public health and journalism, facilitated by communication advancements, is a cornerstone of informed societies. 
Effective communication strategies, driven by collaboration, ensure the accurate dissemination of health information and foster positive behavior change. As the world navigates complex health challenges, the continued evolution of this synergy holds the promise of healthier communities and a more engaged and educated public.

Keywords: public awareness, journalism ethics, health promotion, media influence, health literacy

Procedia PDF Downloads 70
79 Concepts of Technologies Based on Smart Materials to Improve Aircraft Aerodynamic Performance

Authors: Krzysztof Skiba, Zbigniew Czyz, Ksenia Siadkowska, Piotr Borowiec

Abstract:

The article presents selected concepts of technologies that use intelligent materials in aircraft in order to improve their performance. Most of the research focuses on solutions that improve the performance of fixed-wing aircraft, owing to their previously dominant market share. Recently, the development of rotorcraft has been intensive, so there are not only helicopters but also gyroplanes and unmanned aerial vehicles using rotors and vertical take-off and landing. There are many different technologies for changing the shape of an aircraft or its elements. Piezoelectric, deformable actuator systems can be applied in a system for the active control of vibration damping in the aircraft tail structure. Wires made of shape memory alloys (SMA) could be used instead of hydraulic cylinders in the rear part of the aircraft flap. An aircraft made of intelligent materials (piezoelectrics and SMA) is the subject of one of the NASA projects, which provides the possibility of changing the wing shape coefficient by 200%, the wing surface by 50%, and wing deflections by 20 degrees. Active surfaces made of shape memory alloys could be used to control swirls in the flowing stream. An intelligent control system for helicopter blades is a method for the active adaptation of blades to flight conditions and the reduction of vibrations caused by the rotor. Shape memory alloys are capable of recovering their pre-programmed shapes. They are divided into three groups: nickel-titanium-based, copper-based, and ferromagnetic. Due to its strongest shape memory effect and best vibration damping ability, the Ni-Ti alloy is the most commercially important. The subject of this work was to prepare a conceptual design of a rotor blade with SMA actuators. The scope of work included the 3D design of the supporting rotor blade, the 3D design of beams enabling the geometry to be changed by varying the angle of rotation, and FEM (Finite Element Method) analysis. The FEM analysis was performed using NX 12 software in the Pre/Post module, which includes extended finite element modeling tools and visualizations of the obtained results. Calculations are presented for two versions of the blade girders. For the FEM analysis, three types of materials were used for comparison purposes (ABS, aluminium alloy 7057, steel C45). An analysis of internal stresses and extreme displacements of the crossbar edges was carried out. The internal stresses in all materials were close to the yield point in the solution of girder no. 1. For the girder no. 2 solution, the value of stresses decreased by about 45%. As a result of the displacement analysis, it was found that the best solution was the ABS girder no. 1. A displacement of about 0.5 mm was obtained, which resulted in turning the crossbars (upper and lower) by an angle equal to 3.59 degrees. This is the largest deviation of all the tests. The smallest deviation was obtained for beam no. 2 made of steel. The displacement value of the second girder solution was approximately 30% lower than that of the first solution. Acknowledgement: This work has been financed by the Polish National Centre for Research and Development under the LIDER program, Grant Agreement No. LIDER/45/0177/L-9/17/NCBR/2018.
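
The reported relationship between crossbar-edge displacement and rotation angle can be sanity-checked with elementary rigid-rotation geometry. The sketch below assumes a simple model in which the edge displacement d and an effective lever arm r satisfy theta = arctan(d / r); the lever arm is a hypothetical value back-calculated here, not a quantity given in the abstract, and the snippet is an illustration rather than the authors' FEM post-processing.

```python
import math

def rotation_angle_deg(displacement_mm: float, lever_arm_mm: float) -> float:
    """Rigid-body rotation angle implied by an edge displacement
    acting at a given lever arm (simple planar geometry)."""
    return math.degrees(math.atan2(displacement_mm, lever_arm_mm))

# Assumed lever arm: back-calculated so the reported 0.5 mm displacement
# reproduces the reported 3.59 degree rotation (hypothetical value).
lever_arm = 0.5 / math.tan(math.radians(3.59))   # ~8.0 mm
print(f"lever arm ~ {lever_arm:.1f} mm")
print(f"angle for 0.50 mm: {rotation_angle_deg(0.50, lever_arm):.2f} deg")
# A ~30% smaller displacement (girder no. 2) gives a proportionally
# smaller rotation in this linearized picture:
print(f"angle for 0.35 mm: {rotation_angle_deg(0.35, lever_arm):.2f} deg")
```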

Keywords: aircraft, helicopters, shape memory alloy, SMA, smart material, unmanned aerial vehicle, UAV

Procedia PDF Downloads 138
78 Teaching about Justice With Justice: How Using Experiential, Learner Centered Literacy Methodology Enhances Learning of Justice Related Competencies for Young Children

Authors: Bruna Azzari Puga, Richard Roe, Andre Pagani de Souza

Abstract:

This abstract outlines a proposed study to examine how and to what extent interactive, experiential, learner-centered methodology develops basic civic and democratic competencies among young children. It stems from the Literacy and Law course taught at Georgetown University Law Center in Washington, DC, since 1998. Law students, trained in best literacy practices and legal cases affecting literacy development, read law-related children’s books and engage in interactive and extension activities with emerging readers. The law students write a monthly journal describing their experiences and a final paper: a conventional paper or a children’s book illuminating some aspect of literacy and law. This proposal is based on the recent adaptation of Literacy and Law to Brazil at Mackenzie Presbyterian University in São Paulo in three forms: first, a course similar to the US model, often conducted jointly online with Brazilian and US law students; second, a similar course that combines readings of children’s literature with activity-based learning, taught by law students from a satellite Mackenzie campus to young children from a vulnerable community near the city; and third, a course taught by law students at the main Mackenzie campus for 4th grade students at the Mackenzie elementary school that is wholly activity- and discourse-based. The workings and outcomes of these courses are well documented by photographs, reports, lesson plans, and law student journals. The authors, faculty who teach the above courses at Mackenzie and Georgetown, observe that literacy, broadly defined as cognitive and expressive development through reading and discourse-based activities, can be influential in developing democratic civic skills, identifiable by explicit civic competencies. For example, children experience justice in the classroom through cooperation, creativity, diversity, fairness, systemic thinking, and appreciation for rules and their purposes. Moreover, the learning of civic skills as well as literacy skills is enhanced through interactive, learner-centered practices in which the learners experience literacy and civic development. This study will develop rubrics for individual and classroom teaching and supervision by examining 1) the children’s books and diaries of participating law students, 2) the collection of photos and videos of classroom activities, and 3) faculty and supervisor observations and reports. These rubrics, and the lesson plans and activities employed to advance higher levels of performance outcomes, will be useful in training and supervision and in further replication and promotion of this form of teaching and learning. Examples of outcomes include helping, cooperating, and participating; appreciation of viewpoint diversity; knowledge and utilization of democratic processes, including due process, advocacy, individual and shared decision making, consensus building, and voting; and establishing and valuing appropriate rules and a reasoned approach to conflict resolution. In conclusion, further development and replication of the learner-centered literacy and law practices outlined here can lead to improved qualities of democratic teaching and learning supporting mutual respect, positivity, deep learning, and the common good, the foundational qualities of a sustainable world.

Keywords: democracy, law, learner-centered, literacy

Procedia PDF Downloads 121
77 In Vitro Intestine Tissue Model to Study the Impact of Plastic Particles

Authors: Ashleigh Williams

Abstract:

Micro- and nanoplastics’ (MNLPs) omnipresence and ecological accumulation are evident when surveying recent environmental impact studies. For example, in 2014 it was estimated that at least 52.3 trillion plastic microparticles are floating at sea, and scientists have even found plastics present in remote Arctic ice and snow (5,6). Plastics have even found their way into precipitation, with more than 1,000 tons of microplastics falling as rain onto the Western United States in 2020. Even more recent studies evaluating the chemical safety of reusable plastic bottles found that hundreds of chemicals leached into the control liquid in the bottle (ddH2O, pH = 7) during a 24-hour period. A consequence of the increasing abundance of plastic waste in the air, land, and water every year is the bioaccumulation of MNLPs in ecosystems and trophic niches of the animal food chain, which could potentially increase direct and indirect human exposure to MNLPs via inhalation, ingestion, and dermal contact. Though the detrimental, toxic effects of MNLPs have been established in marine biota, much less is known about the potentially hazardous health effects of chronic MNLP ingestion in humans. Recent data indicate that long-term exposure to MNLPs could cause inflammatory and dysbiotic effects. However, toxicity seems to be largely dose- as well as size-dependent. In addition, the transcytotic uptake of MNLPs through the intestinal epithelia in humans remains relatively unknown. To this point, the goal of the current study was to investigate the mechanisms of micro- and nanoplastic uptake and transcytosis of polystyrene (PS) in human stem-cell-derived, physiologically relevant in vitro intestinal model systems, and to compare the relative effects of particle size (30 nm, 100 nm, 500 nm, and 1 µm) and concentration (0 µg/mL, 250 µg/mL, 500 µg/mL, 1000 µg/mL) on polystyrene MNLP uptake, transcytosis, and intestinal epithelial model integrity. Observational and quantitative data obtained from confocal microscopy, immunostaining, transepithelial electrical resistance (TEER) measurements, cryosectioning, and ELISA assays of the proinflammatory cytokines Interleukin-6 and Interleukin-8 were used to evaluate the localization and transcytosis of polystyrene MNLPs and their impact on epithelial integrity in human-derived intestinal in vitro model systems. The effect of Microfold (M) cell induction on polystyrene MNLP uptake, transcytosis, and potential inflammation was also assessed and compared to samples grown under standard conditions. Microfold (M) cells link the human intestinal system to the immune system and are the primary cells in the epithelium responsible for sampling and transporting foreign matter of interest from the lumen of the gut to underlying immune cells. Given the capability of Microfold cells to interact both specifically and nonspecifically with abiotic and biotic materials, it was expected that M-cell-induced in vitro samples would show increased binding, localization, and potentially transcytosis of polystyrene MNLPs across the epithelial barrier. The experimental results of this study would not only help in the evaluation of plastic toxicity but would also allow for more detailed modeling of gut inflammation and the intestinal immune system.

Keywords: nanoplastics, enteroids, intestinal barrier, tissue engineering, microfold (M) cells

Procedia PDF Downloads 84
76 Design and Implementation of an Affordable Electronic Medical Record in a Rural Healthcare Setting: A Qualitative Intrinsic Phenomenon Case Study

Authors: Nitika Sharma, Yogesh Jain

Abstract:

Introduction: An efficient information system helps in improving service delivery as well as providing the foundation for policy and regulation of the other building blocks of a health system. Health care organizations require integrated working of their various sub-systems. Efficient EMR software boosts teamwork amongst the various sub-systems, thereby resulting in improved service delivery. Although there has been a huge impetus to EMR under the Digital India initiative, it has still not been mandated in India. It is generally implemented only in well-funded public or private healthcare organizations. Objective: The study was conducted to understand the factors that lead to the successful adoption of an affordable EMR in a low-resource healthcare organization. It intended to understand the design of the EMR and address the solutions to the challenges faced in its adoption. Methodology: The study was conducted in a non-profit registered healthcare organization that has been providing healthcare facilities to more than 2500 villages, including certain areas that are difficult to access. The data were collected with the help of field notes, in-depth interviews, and participant observation. A total of 16 participants using the EMR from different departments were enrolled via a purposive sampling technique. The participants included in the study were working in the organization before the implementation of the EMR system. The study was conducted over a one-month period, from 25 June to 20 July 2018. Ethical approval was obtained from the institute along with the prior approval of the participants. Data analysis: A word document of more than 4000 words was obtained after transcribing and translating the answers of respondents. It was further analyzed by focused coding, a line-by-line review of the transcripts, underlining words, phrases, or sentences that might suggest themes, in order to conduct a thematic narrative analysis. Results: Based on the answers, the results were thematically grouped under four headings: 1. governance of the organization, 2. architecture and design of the software, 3. features of the software, 4. challenges faced in adoption and the solutions to address them. It was inferred that the successful implementation was attributable to the easy and comprehensive design of the system, which has facilitated not only easy data storage and retrieval but also contributed to constructing a decision support system for the staff. Portability has led to increased acceptance by physicians. The proper division of labor, increased efficiency of staff, incorporation of auto-correction features, and facilitation of task shifting have led to increased acceptance amongst users of various departments. Geographical inhibitions, low computer literacy, and high patient load were the major challenges faced during its implementation. Despite dual efforts made both by the architects and administrators to combat these challenges, there are still certain ongoing challenges faced by the organization. Conclusion: Whenever any new technology is adopted, there are certain innovators, early adopters, late adopters, and laggards. The same pattern was followed in the adoption of this software. The challenges were overcome with the joint efforts of organization administrators and users. This case study thereby provides a framework for implementing similar systems in the public sector of countries that are struggling to digitize healthcare amid a crunch of human and financial resources.

Keywords: EMR, healthcare technology, e-health, EHR

Procedia PDF Downloads 105
75 Extension of Moral Agency to Artificial Agents

Authors: Sofia Quaglia, Carmine Di Martino, Brendan Tierney

Abstract:

Artificial Intelligence (A.I.) pervades various aspects of modern life, from the machine learning algorithms predicting stocks on Wall Street to the killing of belligerents and innocents alike on the battlefield. Moreover, the end goal is to create autonomous A.I.; this means that the presence of humans in the decision-making process will be absent. The question comes naturally: when an A.I. does something wrong, when its behavior is harmful to the community and its actions go against the law, who is to be held responsible? This research’s subject matter in A.I. and robot ethics focuses mainly on robot rights, and its ultimate objective is to answer the questions: (i) What is the function of rights? (ii) Who is a right holder, what is personhood, and what are the requirements needed to be a moral agent (therefore, accountable for responsibility)? (iii) Can an A.I. be a moral agent? (ontological requirements) and finally (iv) ought it to be one? (ethical implications). To answer these questions, this research project was conducted via a collaboration between the School of Computer Science in the Technical University of Dublin, which oversaw the technical aspects of this work, and the Department of Philosophy in the University of Milan, which supervised the philosophical framework and argumentation of the project. Firstly, it was found that all rights are positive and based on consensus; they change with time based on circumstances. Their function is to protect the social fabric and avoid dangerous situations. The same goes for the requirements considered necessary to be a moral agent: those are not absolute; in fact, they are constantly redesigned. Hence, the next logical step was to identify which requirements are regarded as fundamental in real-world judicial systems, comparing them to the ones used in philosophy. Autonomy, free will, intentionality, consciousness, and responsibility were identified as the requirements to be considered a moral agent. The work went on to build a symmetrical system between personhood and A.I. to enable the ontological differences between the two to emerge. Each requirement is introduced, explained through the most relevant theories of contemporary philosophy, and observed in its manifestation in A.I. Finally, after completing the philosophical and technical analysis, conclusions were drawn. As underlined in the research questions, there are two issues regarding the assignment of moral agency to artificial agents: the first is whether all the ontological requirements are present, and the second is whether, present or not, an A.I. ought to be considered an artificial moral agent. From an ontological point of view, it is very hard to prove that an A.I. could be autonomous, free, intentional, conscious, and responsible. The philosophical accounts are often very theoretical and inconclusive, making it difficult to fully detect these requirements on an experimental level of demonstration. However, from an ethical point of view, it makes sense to consider some A.I. systems as artificial moral agents, hence responsible for their own actions. When considering artificial agents as responsible, already existing norms in our judicial system can be applied, such as removing them from society and re-educating them in order to re-introduce them to society. This is in line with how the highest-profile correctional facilities ought to work. Noticeably, this is a provisional conclusion, and research must continue further.
Nevertheless, the strength of the presented argument lies in its immediate applicability to real-world scenarios. To refer to the aforementioned incidents involving the killing of innocents: when this thesis is applied, it is possible to hold an A.I. accountable and responsible for its actions. This implies removing it from society by virtue of its unusability, re-programming it and, only when it is properly functioning, re-introducing it successfully.

Keywords: artificial agency, correctional system, ethics, natural agency, responsibility

Procedia PDF Downloads 188
74 The Use of Artificial Intelligence in the Context of a Space Traffic Management System: Legal Aspects

Authors: George Kyriakopoulos, Photini Pazartzis, Anthi Koskina, Crystalie Bourcha

Abstract:

The need for securing safe access to and return from outer space, as well as ensuring the viability of outer space operations, keeps alive the debate over promoting the organization of space traffic through a Space Traffic Management (STM) system. The proliferation of outer space activities in recent years, as well as the dynamic emergence of the private sector, has gradually resulted in a diverse universe of actors operating in outer space. These developments have had an increasingly adverse impact on outer space sustainability, as the growing number of space debris clearly demonstrates. This landscape poses considerable threats to the outer space environment and its operators, which need to be addressed by a combination of scientific-technological measures and regulatory interventions. In this context, recourse to recent technological advancements and, in particular, to Artificial Intelligence (AI) and machine learning systems could yield substantial gains in promoting space traffic management with respect to collision avoidance as well as launch and re-entry procedures/phases. New technologies can support the prospects of a successful space traffic management system at an international scale by enabling, inter alia, timely, accurate, and analytical processing of large data sets and rapid decision-making, more precise space debris identification and tracking, and the overall minimization of collision risks and reduction of operational costs. What is more, a significant part of space activities (i.e., the launch and/or re-entry phase) takes place in airspace rather than in outer space; hence the overall discussion also involves the highly developed, both technically and legally, international (and national) Air Traffic Management (ATM) system. Nonetheless, from a regulatory perspective, the use of AI for the purposes of space traffic management puts forward implications that merit particular attention. Key issues in this regard include the delimitation of AI-based activities as space activities, the designation of the applicable legal regime (international space or air law, national law), the assessment of the nature and extent of international legal obligations regarding space traffic coordination, as well as the appropriate liability regime applicable to AI-based technologies when operating for space traffic coordination, taking into particular consideration the dense regulatory developments at the EU level. In addition, the prospects of institutionalizing international cooperation and promoting an international governance system, together with the challenges of establishing a comprehensive international STM regime, are revisited in light of the intervention of AI technologies. This paper aims at examining, in toto, the regulatory implications advanced by the use of AI technology in the context of space traffic management operations and its key correlated concepts (SSA, space debris mitigation), drawing in particular on international and regional considerations in the field of STM (e.g., UNCOPUOS, the International Academy of Astronautics, and the European Space Agency, among other actors), the promising advancements of the EU approach to AI regulation and, last but not least, national approaches regarding the use of AI in the context of space traffic management.
Acknowledgment: The present work was co-funded by the European Union and Greek national funds through the Operational Program "Human Resources Development, Education and Lifelong Learning" (NSRF 2014-2020), under the call "Supporting Researchers with an Emphasis on Young Researchers – Cycle B" (MIS: 5048145).

Keywords: artificial intelligence, space traffic management, space situational awareness, space debris

Procedia PDF Downloads 256
73 Enhancing the Implementation Strategy of Simultaneous Operations (SIMOPS) for the Major Turnaround at Pertamina Plaju Refinery

Authors: Fahrur Rozi, Daniswara Krisna Prabatha, Latief Zulfikar Chusaini

Abstract:

Amidst the backdrop of Pertamina Plaju Refinery, which stands as the oldest and historically least technologically advanced among Pertamina's refineries, lies a unique challenge. Originally integrating facilities established by Shell in 1904 and Stanvac (originally Standard Oil) in 1926, the primary challenge at Plaju Refinery does not solely revolve around complexity; instead, it lies in ensuring reliability, considering its operational history of over a century. In all that time, Plaju Refinery has never undergone a comprehensive major turnaround encompassing all its units. The usual practice involves partial turnarounds that are sequentially conducted across its primary, secondary, and tertiary units (utilities and offsite). However, a significant shift is on the horizon. In Q4 2023, the refinery embarks on its first-ever major turnaround since its establishment. This decision was driven by the alignment of maintenance timelines across various units. Plaju Refinery's major turnaround was scheduled for October-November 2023, spanning 45 calendar days, with the objective of enhancing the operational reliability of all refinery units. The extensive job list for this turnaround encompasses 1583 tasks across 18 units/areas, involving approximately 9000 contracted workers. In this context, the strategy of Simultaneous Operations (SIMOPS) execution emerges as a pivotal tool to optimize time efficiency and ensure safety. A Hazard Effect Management Process (HEMP) has been employed to assess the risk rating of each task within the turnaround. Of the tasks assessed, 22 are deemed high-risk and necessitate mitigation. The SIMOPS approach serves as a preventive measure against potential incidents. It is noteworthy that every turnaround period at Pertamina Plaju Refinery involves SIMOPS-related tasks. In this context, enhancing the implementation strategy of SIMOPS becomes imperative to minimize the occurrence of incidents. At least four improvements have been introduced in the enhancement process for the major turnaround at Plaju Refinery. The first improvement involves conducting systematic risk assessment and potential hazard mitigation studies for SIMOPS tasks before task execution, as opposed to the previous on-site approach. The second improvement includes the completion of SIMOPS Job Mitigation and Work Matrices Sheets, which was often neglected in the past. The third improvement emphasizes building comprehensive awareness among workers/contractors regarding potential hazards and mitigation strategies for SIMOPS tasks before and during the major turnaround. The final improvement is the introduction of a daily program for inspecting and observing work in progress on SIMOPS tasks; prior to these improvements, there was no established program for monitoring ongoing SIMOPS-related activities during the turnaround. This study elucidates the steps taken to enhance SIMOPS within Pertamina, drawing on the experience of Plaju Refinery as a guide, and an actual case study from our experience in the operational unit is provided. In conclusion, these efforts are essential for the success of the first-ever major turnaround at Plaju Refinery, with the SIMOPS strategy serving as a central component. Based on these experiences, enhancements have been made to Pertamina's official internal guidelines for executing SIMOPS risk mitigation, benefiting all Pertamina units.
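
As a rough illustration of the HEMP-style screening described above, the sketch below rates each task by likelihood and severity on a risk matrix and flags those above a threshold for SIMOPS mitigation. The 5x5 scale, the "high" threshold, and the task data are assumptions for demonstration only; the abstract does not specify the actual HEMP rating scheme.

```python
# Minimal sketch of a HEMP-style risk screening for turnaround tasks.
# The 5x5 likelihood/severity scale and the "high risk" threshold are
# assumed for illustration; the real HEMP criteria are not given here.

def risk_rating(likelihood: int, severity: int) -> str:
    """Classify a task on an assumed 5x5 risk matrix."""
    score = likelihood * severity
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

tasks = [  # (task description, likelihood 1-5, severity 1-5): hypothetical data
    ("hot work near live flare line", 4, 5),
    ("scaffolding in pipe rack", 3, 3),
    ("vessel internal inspection", 2, 2),
]

# High-risk tasks would enter the SIMOPS Job Mitigation and Work Matrices Sheet.
needs_mitigation = [name for name, l, s in tasks if risk_rating(l, s) == "high"]
print(needs_mitigation)
```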

Keywords: process safety management, turn around, oil refinery, risk assessment

Procedia PDF Downloads 72
72 Modelling Spatial Dynamics of Terrorism

Authors: André Python

Abstract:

To this day, terrorism persists as a worldwide threat, exemplified by the recent deadly attacks of January 2015 in Paris and the ongoing massacres perpetrated by ISIS in Iraq and Syria. In response to this threat, states deploy various counterterrorism measures, the cost of which could be reduced through effective preventive measures. In order to increase the efficiency of preventive measures, policy-makers may benefit from accurate predictive models that are able to capture the complex spatial dynamics of terrorism occurring at a local scale. Despite empirical research carried out at the country level that has confirmed theories explaining the diffusion processes of terrorism across space and time, scholars have failed to assess diffusion theories on a local scale. Moreover, since scholars have not made the most of recent statistical modelling approaches, they have been unable to build predictive models accurate in both space and time. In an effort to address these shortcomings, this research suggests a novel approach to systematically assess the theories of terrorism's diffusion on a local scale and provide a predictive model of the local spatial dynamics of terrorism worldwide. With a focus on the lethal terrorist events that occurred after 9/11, this paper addresses the following question: why and how does lethal terrorism diffuse in space and time? Based on geolocalised data on worldwide terrorist attacks and covariates gathered from 2002 to 2013, a binomial spatio-temporal point process is used to model the probability of terrorist attacks on a sphere (the world), the surface of which is discretised in the form of Delaunay triangles and refined in areas of specific interest. Within a Bayesian framework, the model is fitted through an integrated nested Laplace approximation, a recent fitting approach that computes fast and accurate estimates of posterior marginals. Hence, for each location in the world, the model provides a probability of encountering a lethal terrorist attack and measures of volatility, which inform on the model's predictability. Diffusion processes are visualised through interactive maps that highlight space-time variations in the probability and volatility of encountering a lethal attack from 2002 to 2013. Based on the previous twelve years of observation, the location and lethality of terrorist events in 2014 are accurately predicted. Throughout the global scope of this research, local diffusion processes such as escalation and relocation are systematically examined: the former process describes an expansion from areas with high concentrations of lethal terrorist events (hotspots) to neighbouring areas, while the latter is characterised by changes in the location of hotspots. By controlling for the effect of geographical, economic, and demographic variables, the results of the model suggest that the diffusion processes of lethal terrorism are jointly driven by contagious and non-contagious factors that operate on a local scale, as predicted by theories of diffusion. Moreover, by providing a quantitative measure of predictability, the model prevents policy-makers from making decisions based on highly uncertain predictions. Ultimately, this research may provide important complementary tools to enhance the efficiency of policies that aim to prevent and combat terrorism.
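
The paper fits a Bayesian spatio-temporal point process with INLA on a triangulated sphere; as a much-simplified stand-in, the sketch below discretizes space into grid cells and fits a logistic model in which the probability of an attack in a cell depends on covariates and on attacks in neighbouring cells in the previous period (a crude contagion term). All data, dimensions, and variable names are synthetic assumptions; this is an illustration of the modelling idea, not the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical data: attacks[t, i, j] = 1 if a lethal attack occurred in
# grid cell (i, j) during year t; covars holds two per-cell covariates
# (e.g. population, GDP), also synthetic here.
T, N = 12, 20
attacks = rng.binomial(1, 0.05, size=(T, N, N))
covars = rng.normal(size=(N, N, 2))

def neighbour_count(grid):
    """Number of attacked neighbours (4-neighbourhood) for each cell."""
    padded = np.pad(grid, 1)
    return (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:])

# Build (features, label) pairs: predict year t from year t-1.
X_rows, y_rows = [], []
for t in range(1, T):
    lag = neighbour_count(attacks[t - 1])
    feats = np.dstack([covars, lag[..., None]]).reshape(-1, 3)
    X_rows.append(feats)
    y_rows.append(attacks[t].ravel())
X, y = np.vstack(X_rows), np.concatenate(y_rows)

model = LogisticRegression().fit(X, y)
# A positive coefficient on the lagged-neighbour feature would be the
# crude analogue of a contagious (escalation/relocation) diffusion signal.
print(dict(zip(["covar1", "covar2", "lagged neighbours"], model.coef_[0])))
```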

Keywords: diffusion process, terrorism, spatial dynamics, spatio-temporal modeling

Procedia PDF Downloads 349
71 The Use of the Non-Parametric Bootstrap in Computing Microbial Risk Assessment from Consumption of Lettuce Irrigated with Water Contaminated by Sanitary Sewage in Infulene Valley

Authors: Mario Tauzene Afonso Matangue, Ivan Andres Sanchez Ortiz

Abstract:

The metropolitan area of Maputo (Mozambique's capital city) is located in a semi-arid zone (800 mm annual rainfall) and has 1,101,170 inhabitants. On the west side are the flatlands of Infulene, where the Mulauze River flows towards the Indian Ocean, receiving at this site the storm water contaminated with sanitary sewage from Maputo, transported through a concrete open channel. In Infulene, local communities grow salad crops such as tomato, onion, garlic, lettuce, and cabbage, which are then commercialized and consumed in several markets in Maputo City. Lettuce is the most frequently consumed salad crop, eaten daily in fast food, breakfasts, lunches, and dinners. However, the risk of infection by several pathogens due to the consumption of lettuce, estimated using Quantitative Microbial Risk Assessment (QMRA) tools, is still unknown, since there are few studies or publications concerning this matter in Mozambique. This work is aimed at determining the annual risk arising from the consumption of lettuce grown in the Infulene valley, in Maputo, using QMRA tools. The exposure model was constructed upon the volume of contaminated water remaining on the lettuce leaves, the empirical relations between the number of pathogens and the indicator microorganism (E. coli), the consumption of lettuce (g), and the reduction of pathogens (days). The reference pathogens were Vibrio cholerae, Cryptosporidium, norovirus, and Ascaris. The water quality samples (E. coli) were collected in the storm water channel from January 2016 to December 2018, comprising 65 samples, and the urban lettuce consumption data were collected through an inquiry in the Maputo metropolis covering 350 persons. A non-parametric bootstrap was performed, involving 10,000 iterations over the collected dataset, namely water quality (E. coli) and lettuce consumption. The dose-response models were: exponential for Cryptosporidium, the Kummer confluent hypergeometric function (1F1) for Vibrio and Ascaris, and the Gauss hypergeometric function (2F1(a,b;c;z)) for norovirus. The annual infection risk estimates were computed using R 3.6.0 (R Core Team) software by Monte Carlo (Latin hypercube) sampling, a technique involving 10,000 iterations. The annual infection risk values, expressed by the median and the 95th percentile, per person per year (pppy), arising from the consumption of lettuce are as follows: Vibrio cholerae (1.00, 1.00), Cryptosporidium (3.91x10⁻³, 9.72x10⁻³), norovirus (5.22x10⁻¹, 9.99x10⁻¹), and Ascaris (2.59x10⁻¹, 9.65x10⁻¹). Thus, the consumption of the lettuce would result in risks greater than the tolerable levels (< 10⁻³ pppy or 10⁻⁶ DALY) for all pathogens, and Vibrio cholerae is the most virulent pathogen according to the single-hit models, followed by Ascaris lumbricoides and norovirus. The sensitivity analysis carried out in this work pointed out that, in the whole QMRA, the most important input variable was the reduction of pathogens between harvest and consumption (Spearman rank value 0.69), followed by water quality (Spearman rank value 0.69). Decision-makers (the Mozambique government) must strengthen prevention measures related to pathogen reduction in lettuce (e.g., washing) and engage in wastewater treatment engineering.
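
A minimal numerical sketch of the resampling scheme described above follows: the observed E. coli series is bootstrapped, converted to a dose via a pathogen-to-indicator ratio and exposure volume, and pushed through an exponential dose-response model (the form the authors use for Cryptosporidium). All parameter values here (r, the pathogen ratio, exposure volume, exposure frequency) and the synthetic data are illustrative assumptions, not the study's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical E. coli counts (CFU/100 mL) standing in for the 65
# storm-water samples collected in 2016-2018.
ecoli = rng.lognormal(mean=10, sigma=1.5, size=65)

n_iter = 10_000
# Non-parametric bootstrap: resample the observed data with replacement.
ecoli_bs = rng.choice(ecoli, size=n_iter, replace=True)

# Assumed exposure chain (illustrative values only):
ratio = 1e-5        # pathogens per E. coli CFU
volume_ml = 0.108   # water retained on lettuce per serving (mL)
dose = ecoli_bs / 100.0 * ratio * volume_ml

# Exponential dose-response, P(inf) = 1 - exp(-r * dose);
# r = 0.0042 is a commonly cited Cryptosporidium value (assumed here).
p_single = 1.0 - np.exp(-0.0042 * dose)

# Annual risk over an assumed 365 exposure events per person per year.
p_annual = 1.0 - (1.0 - p_single) ** 365

print(f"median annual risk: {np.median(p_annual):.2e}")
print(f"95th percentile:    {np.percentile(p_annual, 95):.2e}")
```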

Keywords: annual infections risk, lettuce, non-parametric bootstrapping, quantitative microbial risk assessment tools

Procedia PDF Downloads 120
70 IoT Continuous Monitoring of Biochemical Oxygen Demand for Wastewater Effluent Quality: Machine Learning Algorithms

Authors: Sergio Celaschi, Henrique Canavarro de Alencar, Claaudecir Biazoli

Abstract:

Effluent quality is of the highest priority for compliance with the permit limits of environmental protection agencies and ensures the protection of local water systems. Of the pollutants monitored, biochemical oxygen demand (BOD) poses one of the greatest challenges. Delayed BOD5 results from the laboratory, which take 7 to 8 days of analysis, hinder a wastewater treatment plant's (WWTP's) ability to react to different situations and meet treatment goals; this work presents a solution. Reducing BOD turnaround time from days to hours is our quest. The solution is based on a system of two BOD bioreactors associated with Digital Twin (DT) and Machine Learning (ML) methodologies via an Internet of Things (IoT) platform to monitor and control a WWTP and support decision making. A DT is a virtual and dynamic replica of a production process. A DT requires the ability to collect and store real-time sensor data related to the operating environment. Furthermore, it integrates and organizes the data on a digital platform and applies analytical models, allowing a deeper understanding of the real process in order to catch anomalies sooner. In our system for continuous-time monitoring of the BOD removed by the effluent treatment process, the DT algorithm for analyzing the data applies ML to a parameterized chemical kinetic model. The continuous BOD monitoring system, capable of providing results in a fraction of the time required by BOD5 analysis, is composed of two thermally isolated batch bioreactors. Each bioreactor contains input/output access for wastewater samples (influent and effluent), hydraulic conduction tubes, pumps and valves for the batch sample and dilution water, an air supply for dissolved oxygen (DO) saturation, a cooler/heater for sample thermal stability, an optical DO sensor based on fluorescence quenching, pH, ORP, temperature, and atmospheric pressure sensors, and a local PLC/CPU with a TCP/IP data transmission interface. The dynamic BOD monitoring range covers 2 mg/L < BOD < 2,000 mg/L. In addition to the BOD monitoring system, there are many other operational WWTP sensors. The CPU data are transmitted to and received from the digital platform, which in turn performs analyses at periodic intervals, aiming to feed the learning process. BOD bulletins and their credible intervals are made available to web users at 12-hour intervals. The chemical kinetics ML algorithm is composed of a coupled system of four first-order ordinary differential equations for the molar masses of DO, the organic material present in the sample, biomass, and the products (CO₂ and H₂O) of the reaction. This system is solved numerically, linked to its initial conditions: DO (saturated) and the initial products of the kinetic oxidation process, CO₂ = H₂O = 0. The initial values for organic matter and biomass are estimated by minimizing the mean squared deviations. A real case of continuous monitoring of BOD wastewater effluent quality is being conducted by deploying an IoT application on a large wastewater purification system located in São Paulo, Brazil.
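
The kinetic idea behind the digital twin can be illustrated with the classical first-order BOD exertion model, BOD(t) = L0 * (1 - exp(-k t)), fitted by least squares to the bioreactor's oxygen-uptake curve; this mirrors the abstract's estimation of initial organic matter by minimizing mean squared deviations, although the authors' actual model couples four ODEs. The data and parameter values below are synthetic assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def bod_exerted(t, L0, k):
    """First-order BOD exertion: ultimate demand L0 (mg/L), rate k (1/d)."""
    return L0 * (1.0 - np.exp(-k * t))

# Synthetic oxygen-uptake measurements from a batch bioreactor (mg/L),
# standing in for the continuous optical DO sensor readings.
t_days = np.linspace(0, 3, 25)
true_L0, true_k = 180.0, 0.9
rng = np.random.default_rng(1)
y = bod_exerted(t_days, true_L0, true_k) + rng.normal(0, 3, t_days.size)

# Least-squares estimate of (L0, k) from the first days of data:
(L0_hat, k_hat), _ = curve_fit(bod_exerted, t_days, y, p0=(100.0, 0.5))

# BOD5 inferred in hours instead of waiting 5+ days in the lab:
print(f"L0 ~ {L0_hat:.0f} mg/L, k ~ {k_hat:.2f} 1/d")
print(f"predicted BOD5 ~ {bod_exerted(5.0, L0_hat, k_hat):.0f} mg/L")
```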

Keywords: effluent treatment, biochemical oxygen demand, continuous monitoring, IoT, machine learning

Procedia PDF Downloads 72
69 A Multi-Model Approach to Assess Atlantic Bonito (Sarda Sarda, Bloch 1793) in the Eastern Atlantic Ocean: A Case Study of the Senegalese Exclusive Economic Zone

Authors: Ousmane Sarr

Abstract:

The Senegalese coast is highly productive in fishery resources due to the frequent, intense upwelling that occurs along it, caused by the maritime trade winds that make its waters rich in nutrients. Fishing plays a crucial role in Senegal's socioeconomic plans and food security. However, a global diagnosis of the Senegalese maritime fishing sector has highlighted the challenges this sector encounters. Among these concerns, some significant stocks, a priority target for artisanal fishing, need further assessment; if no efforts are made in this direction, most stocks will be overexploited or even decline. It is in this context that this research was initiated. This investigation aimed to apply a multi-model approach (LBB; the catch-only-based CMSY model and its most recent version, CMSY++; JABBA; and JABBA-Select) to assess the stock of Atlantic bonito, Sarda sarda (Bloch, 1793), in the Senegalese Exclusive Economic Zone (SEEZ). Available catch, effort, and size data for Atlantic bonito over 15 years (2004-2018) were used to calculate the nominal and standardized CPUE, the size-frequency distribution, and the lengths at retention (50% and 95% selectivity) of the species. These results were employed as input parameters for the stock assessment models mentioned above to define the stock status of this species in this region of the Atlantic Ocean. The LBB model indicated a healthy Atlantic bonito stock status, with B/BMSY values ranging from 1.3 to 1.6 and B/B0 values varying from 0.47 to 0.61 across the main scenarios performed (BON_AFG_CL, BON_GN_Length, and BON_PS_Length). The results estimated by LBB are consistent with those obtained by CMSY. The CMSY model results demonstrate that the SEEZ Atlantic bonito stock is in sound condition in the final year of the main scenarios analyzed (BON, BON-bt, BON-GN-bt, and BON-PS-bt), with sustainable relative stock biomass (B2018/BMSY = 1.13 to 1.3) and fishing pressure levels (F2018/FMSY = 0.52 to 1.43). The B/BMSY and F/FMSY results for the JABBA model ranged between 2.01 to 2.14 and 0.47 to 0.33, respectively, while the estimated B/BMSY and F/FMSY for JABBA-Select ranged from 1.91 to 1.92 and 0.52 to 0.54. The Kobe plot results of the base case scenarios showed a 75% to 89% probability in the green area, indicating sustainable fishing pressure and a healthy Atlantic bonito stock size capable of producing high yields close to the MSY. Based on the stock assessment results, this study highlighted scientific advice for temporary management measures. Based on the results of the length-based models, this study suggests an improvement of the selectivity parameters of longlines and purse seines and a temporary prohibition of the use of sleeping nets in the fishery for the Atlantic bonito stock in the SEEZ. Although these actions are temporary, they can be essential to reduce or avoid intense pressure on the Atlantic bonito stock in the SEEZ. However, it is necessary to establish harvest control rules to provide coherent and solid scientific information that leads to appropriate decision-making for the rational and sustainable exploitation of Atlantic bonito in the SEEZ and the Eastern Atlantic Ocean.
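
CMSY and JABBA are both built on surplus-production dynamics; the sketch below projects a simple Schaefer model forward under a catch series and reports B/BMSY and F/FMSY, the two ratios quoted above. The r, K, catch, and initial-depletion values are invented for illustration and are not the Senegalese bonito estimates.

```python
import numpy as np

# Schaefer surplus-production model: B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]
r, K = 0.6, 50_000.0            # assumed growth rate and carrying capacity (t)
catches = np.full(15, 6_000.0)  # hypothetical 2004-2018 catch series (t/yr)

B = np.empty(catches.size + 1)
B[0] = 0.8 * K                  # assumed initial depletion
for t, C in enumerate(catches):
    B[t + 1] = max(B[t] + r * B[t] * (1 - B[t] / K) - C, 1e-6)

Bmsy = K / 2                    # Schaefer reference points
Fmsy = r / 2
F_last = catches[-1] / B[-2]    # fishing mortality in the final year

print(f"B/BMSY = {B[-1] / Bmsy:.2f}")   # >1 suggests a healthy stock size
print(f"F/FMSY = {F_last / Fmsy:.2f}")  # <1 suggests sustainable pressure
```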

Keywords: multi-model approach, stock assessment, atlantic bonito, healthy stock, sustainable, SEEZ, temporary management measures

Procedia PDF Downloads 57
68 How the Writer Tells the Story Should Be the Primary Concern rather than Who Can Write about Whom: The Limits of Cultural Appropriation Vis-à-Vis The Ethics of Narrative Empathy

Authors: Alexandra Cheira

Abstract:

Cultural appropriation has been theorised as a form of colonialism in which members of a dominant culture reduce cultural elements that are deeply meaningful to a minority culture to the category of the “exotic other”, since they do not experience the oppression and discrimination faced by members of the minority culture. Yet, in the particular case of literature, writers such as Lionel Shriver and Bernardine Evaristo have argued that authors from a cultural majority have a right to write in the voice of someone from a cultural minority, hence attacking the idea that this is a form of cultural appropriation. By definition, Shriver and Evaristo claim, writers are supposed to write beyond their own culture, gender, class, and/or race. In this light, this paper discusses the limits of cultural appropriation vis-à-vis the ethics of narrative empathy by addressing the mixed critical reception of Kathryn Stockett’s The Help (2009) and Jeanine Cummins’s American Dirt (2020). In fact, both novels were acclaimed as global eye-openers regarding the struggles of, respectively, African American maids and South American migrants. At the same time, both novelists have been accused of cultural appropriation by telling a story that is not theirs to tell, given the fact that they are white women telling these stories in what critics have argued is really an American voice telling a story to American readers. These claims will be investigated within the framework of Edward Said’s foundational examination of Orientalism in the field of postcolonial studies as a Western style for authoritatively restructuring the Orient. This means that Orientalist stereotypes regarding Eastern cultures have implicitly validated colonial and imperial pursuits, in the specific context of literary representations of African American and Mexican cultures by white writers. At the same time, the conflicted reception of American Dirt and The Help will be examined within the critical framework of narrative empathy as theorised by Suzanne Keen. Hence, there will be a particular focus on the way a reader’s heated perception that the author’s perspective is purely dishonest can result from a friction between an author’s intention and a reader’s experience of narrative empathy, while a shared sense of empathy between authors and readers can be a rousing momentum to move beyond literary response to social action. Finally, in order to assess that “the key question should not be who can write about whom, but how the writer tells the story”, the recent controversy surrounding Dutch author Marieke Lucas Rijneveld’s decision to withdraw from translating American poet Amanda Gorman’s work into Dutch will be duly investigated. In fact, Rijneveld stepped down after journalist and activist Janice Deul criticised Dutch publisher Meulenhoff for choosing a translator who was not also Black, despite the fact that 22-year-old Gorman had selected the 29-year-old Rijneveld herself, as a fellow young writer who had likewise come to fame early in life. In this light, the critical argument that the controversial reception of The Help reveals as much about US race relations in the early twenty-first century as about the complex literary transactions between individual readers and the novel itself will also be discussed in the extended context of American Dirt and white author Marieke Rijneveld’s withdrawal from the projected translation of Black poet Amanda Gorman’s work.

Keywords: cultural appropriation, cultural stereotypes, narrative empathy, race relations

Procedia PDF Downloads 68
67 The Roots of Amazonia’s Droughts and Floods: Complex Interactions of Pacific and Atlantic Sea-Surface Temperatures

Authors: Rosimeire Araújo Silva, Philip Martin Fearnside

Abstract:

Extreme droughts and floods in the Amazon have serious consequences for natural ecosystems and the human population in the region. The frequency of these events has increased in recent years, and projections of climate change predict greater frequency and intensity of these events. Understanding the links between these extreme events and different patterns of sea surface temperature in the Atlantic and Pacific Oceans is essential, both to improve the modeling of climate change and its consequences and to support adaptation efforts in the region. The relationship between sea temperatures and events in the Amazon is much more complex than is usually assumed in climatic models. Warming and cooling of different parts of the oceans, as well as the interaction between simultaneous temperature changes in different parts of each ocean and between the two oceans, have specific consequences for the Amazon, with effects on precipitation that vary in different parts of the region. Simplistic generalities, such as the association between El Niño events and droughts in the Amazon, do not capture this complexity. We investigated the variability of sea surface temperature (SST) in the tropical Pacific Ocean during the period 1950-2022 using Empirical Orthogonal Functions (EOF), spectral analysis, and coherence and wavelet phase analysis. Two main modes of variability were identified, which explain about 53.9% and 13.3%, respectively, of the total variance of the data. The spectral, coherence, and wavelet phase analyses showed that the first selected mode represents warming in the central part of the Pacific Ocean (the “Central El Niño”), while the second mode represents warming in the eastern part of the Pacific (the “Eastern El Niño”). Although the 1982-1983 and 1976-1977 El Niño events were both characterized by an increase in sea surface temperatures in the Equatorial Pacific, their impacts on rainfall in the Amazon were distinct. In the rainy season, from December to March, the sub-basins of the Japurá, Jutaí, Jatapu, Tapajós, Trombetas, and Xingu rivers were the regions that showed the greatest reductions in rainfall associated with the Central El Niño (1982-1983), while the sub-basins of the Javari, Purus, Negro, and Madeira rivers had the most pronounced reductions in the year of the Eastern El Niño (1976-1977). In the transition to the dry season, in April, the greatest reductions were associated with the Eastern El Niño year for the majority of the study region, with the exception only of the sub-basins of the Madeira, Trombetas, and Xingu rivers, whose reductions were associated with the Central El Niño. In the dry season, from July to September, the sub-basins of the Japurá, Jutaí, Jatapu, Javari, Trombetas, and Madeira rivers showed the greatest reductions in rainfall associated with the Central El Niño, while the sub-basins of the Tapajós, Purus, Negro, and Xingu rivers had the most pronounced reductions in the Eastern El Niño year for this season. In this way, it is possible to conclude that the Central (Eastern) El Niño controlled the reductions in soil moisture in the dry (rainy) season for all sub-basins examined in this study. Extreme drought events associated with these meteorological phenomena can lead to a significant increase in the occurrence of forest fires.
These fires have a devastating impact on Amazonian vegetation, resulting in the irreparable loss of biodiversity and the release of large amounts of carbon stored in the forest, contributing to the increase in the greenhouse effect and global climate change.
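
EOF analysis is, computationally, a principal-component decomposition of the SST anomaly field; the sketch below shows the standard SVD route to the leading modes and their explained variance (the quantities reported as 53.9% and 13.3% above). The data here are synthetic and the grid and time dimensions are placeholders, so this illustrates the method rather than reproducing the study's results.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic monthly SST anomalies over 1950-2022 on a small tropical
# Pacific grid, arranged as (time, lat*lon). Real data would be
# detrended, deseasonalized, and latitude-weighted first.
n_time, n_grid = 73 * 12, 30 * 60
sst_anom = rng.normal(size=(n_time, n_grid))

# EOFs via singular value decomposition of the anomaly matrix:
# rows of Vt are spatial patterns (EOFs), U*s are the principal components.
sst_anom -= sst_anom.mean(axis=0)   # remove the time mean per grid cell
U, s, Vt = np.linalg.svd(sst_anom, full_matrices=False)

explained = s**2 / np.sum(s**2)
print(f"mode 1 explains {explained[0]:.1%} of variance")  # 'Central El Nino' analogue
print(f"mode 2 explains {explained[1]:.1%} of variance")  # 'Eastern El Nino' analogue

pcs = U * s   # mode time series, inputs to coherence/wavelet phase analysis
eofs = Vt     # spatial patterns, one row per mode
```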

Keywords: sea surface temperature, variability, climate, Amazon

Procedia PDF Downloads 63
66 Meta-Analysis of Previously Unsolved Cases of Aviation Mishaps Employing Molecular Pathology

Authors: Michael Josef Schwerer

Abstract:

Background: Analyzing any aircraft accident is mandatory based on the regulations of the International Civil Aviation Organization and the respective country's criminal prosecution authorities. Legal medicine investigations are unavoidable when fatalities involve the flight crew or when doubts arise concerning the pilot's aeromedical health status before the event. As a result of the frequently tremendous blunt and sharp force trauma accompanying the impact of the aircraft with the ground, consecutive blast or fire exposure of the occupants, or putrefaction of the bodies in cases of delayed recovery, relevant findings can be masked or destroyed and therefore be inaccessible in standard pathology practice, which comprises just forensic autopsy and histopathology. Such cases are at considerable risk of remaining unsolved, without legal consequences for those responsible. Further, no lessons can be drawn from these scenarios to improve flight safety and prevent future mishaps. Aims and Methods: To learn from previously unsolved aircraft accidents, re-evaluations of the investigation files and modern molecular pathology studies were performed. Genetic testing involved predominantly PCR-based analysis of gene regulation, studying DNA promoter methylation, RNA transcription, and post-transcriptional regulation. In addition, the presence or absence of infective agents, particularly DNA and RNA viruses, was studied. Technical adjustments of the molecular genetic procedures were necessary when working with archived sample material, and standards for the proper interpretation of the respective findings had to be established. Results and Discussion: Additional molecular genetic testing significantly contributes to the quality of forensic pathology assessment in aviation mishaps. Previously undetected cardiotropic viruses potentially explain, for example, a pilot's sudden incapacitation resulting from cardiac failure or myocardial arrhythmia. In contrast, negative results for infective agents help rule out concerns about an accident pilot's fitness to fly and support the aeromedical examiner's precedent decision to issue him or her an aeromedical certificate. Care must be taken in the interpretation of genetic testing for pre-existing diseases such as hypertrophic cardiomyopathy or ischemic heart disease. Molecular markers such as mRNAs or miRNAs, which can establish these diagnoses in clinical patients, might be misleading in flight crew members because of adaptive changes in their tissues resulting, for instance, from repeated mild hypoxia during flight. Military pilots especially demonstrate significant physiological adjustments to their somatic burdens in flight, such as cardiocirculatory stress and air combat maneuvers. Their non-pathogenic alterations in gene regulation and expression will likely be mistaken for genuine disease by inexperienced investigators. Conclusions: The growing influence of molecular pathology on legal medicine practice has found its way into aircraft accident investigation. Provided that appropriate quality standards for laboratory work and data interpretation are maintained, forensic genetic testing supports the medico-legal analysis of aviation mishaps and could potentially reduce the number of unsolved events in the future.

Keywords: aviation medicine, aircraft accident investigation, forensic pathology, molecular pathology

Procedia PDF Downloads 42
65 Examining Three Psychosocial Factors of Tax Compliance in Self-Employed Individuals Using the MINDSPACE Framework: Evidence from Australia and Pakistan

Authors: Amna Tariq Shah

Abstract:

Amid the pandemic, the contemporary landscape has experienced accelerated growth in small business activities and an expanding digital marketplace, further exacerbating the issue of non-compliance among self-employed individuals through aggressive tax planning and evasion. This research seeks to address these challenges by developing strategic tax policies that promote voluntary compliance and improve taxpayer facilitation. The study employs the innovative MINDSPACE framework to examine three psychosocial factors—tax communication, tax literacy, and shaming—to optimize policy responses, address administrative shortcomings, and ensure adequate revenue collection for public goods and services. Preliminary findings suggest that incomprehensible communication from tax authorities drives individuals to seek alternative, potentially biased sources of tax information, thereby exacerbating non-compliance. Furthermore, the study reveals low tax literacy among Australian and Pakistani respondents, with many struggling to navigate complex tax processes and comprehend tax laws. Consequently, policy recommendations include simplifying tax return filing and enhancing pre-populated tax returns. In terms of shaming, the research indicates that Australians, being an individualistic society, may not respond well to shaming techniques due to privacy concerns. In contrast, Pakistanis, as a collectivistic society, may be more receptive to naming and shaming approaches. The study employs a mixed-method approach, utilizing interviews and surveys to analyze the issue in both jurisdictions. The use of mixed methods allows for a more comprehensive understanding of tax compliance behavior, combining the depth of qualitative insights with the generalizability of quantitative data, ultimately leading to more robust and well-informed policy recommendations. By examining evidence from opposite jurisdictions, namely a developed country (Australia) and a developing country (Pakistan), the study's applicability is enhanced, providing perspectives from two disparate contexts that offer insights from opposite ends of the economic, cultural, and social spectra. The non-comparative case study methodology offers valuable insights into human behavior, which can be applied to other jurisdictions as well. The application of the MINDSPACE framework in this research is particularly significant, as it introduces a novel approach to tax compliance behavior analysis. By integrating insights from behavioral economics, the framework enables a comprehensive understanding of the psychological and social factors influencing taxpayer decision-making, facilitating the development of targeted and effective policy interventions. This research carries substantial importance as it addresses critical challenges in tax compliance and administration, with far-reaching implications for revenue collection and the provision of public goods and services. By investigating the psychosocial factors that influence taxpayer behavior and utilizing the MINDSPACE framework, the study contributes invaluable insights to the field of tax policy. These insights can inform policymakers and tax administrators in developing more effective tax policies that enhance taxpayer facilitation, address administrative obstacles, promote a more equitable and efficient tax system, and foster voluntary compliance, ultimately strengthening the financial foundation of governments and communities.

Keywords: individual tax compliance behavior, psychosocial factors, tax non-compliance, tax policy

Procedia PDF Downloads 74
64 Factors Associated with Risky Sexual Behaviour in Adolescent Girls and Young Women in Cambodia: A Systematic Review

Authors: Farwa Rizvi, Joanne Williams, Humaira Maheen, Elizabeth Hoban

Abstract:

There is an increase in risky sexual behaviour and unsafe sex among adolescent girls and young women aged 15 to 24 years in Cambodia, which negatively affects their reproductive health by increasing the risk of contracting sexually transmitted infections and of unintended pregnancies. Risky sexual behaviour includes ‘having sex at an early age, having multiple sexual partners, having sex while under the influence of alcohol or drugs, and unprotected sexual behaviors’. A systematic review of quantitative research conducted in Cambodia was undertaken, using the Social Ecological Model as the theoretical framework to identify the personal, social and cultural factors associated with risky sexual behaviour and unsafe sex in young Cambodian women. PRISMA guidelines were used to search databases including Medline Complete, PsycINFO, CINAHL Complete, Academic Search Complete, Global Health, and Social Work Abstracts; additional searches were conducted in Science Direct, Google Scholar and grey literature sources. A risk-of-bias tool developed explicitly for systematic reviews of cross-sectional studies was used; after inter-rater assessment, the overall risk of bias was rated high in two studies, moderate in one and low in one. The search strategy combined subject terms and free-text terms. The medical subject headings (MeSH) terms were: contracept* or ‘birth control’ or ‘family planning’ or pregnan* or ‘safe sex’ or ‘protected intercourse’ or ‘unprotected intercourse’ or ‘protected sex’ or ‘unprotected sex’ or ‘risky sexual behaviour*’ or ‘abort*’ or ‘planned parenthood’ or ‘unplanned pregnancy’ AND ( barrier* or obstacle* or challenge* or knowledge or attitude* or factor* or determinant* or choic* or uptake or discontinu* or acceptance or satisfaction or ‘needs assessment’ or ‘non-use’ or ‘unmet need’ or ‘decision making’ ) AND Cambodia*. An initial 300 studies were identified using the keywords; four quantitative studies, published between 2010 and 2016, met the inclusion criteria. Study participants ranged in age from 10 to 24 years, were single or married, and had 3 to 10 completed years of education. The mean age at sexual debut was reported to be 18 years. Viewed through the Social Ecological Model, risky sexual behaviour was associated with individual-level factors including young age at sexual debut, low education, unsafe sex under the influence of alcohol or other substances, and multiple sexual partners or transactional sex. Family-level factors included living away from parents, orphan status and low levels of family support. Peer- and partner-level factors included peer delinquency and lack of condom use. Low socioeconomic status at the societal level was also associated with risky sexual behaviour. There is scant research on the sexual and reproductive health of adolescent girls and young women in Cambodia. Individual, family and social factors were significantly associated with risky sexual behaviour, and more research is required to inform preventive strategies and policies that address young women’s sexual and reproductive health.
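
To keep a Boolean strategy of this kind reproducible, the search string can be assembled programmatically from term lists. The sketch below is illustrative only; the term lists are abbreviated from those given above.

```python
# A minimal sketch: building the review's Boolean search string from term
# lists so the strategy can be re-run verbatim. Term lists are abbreviated.
outcome_terms = ["contracept*", "'birth control'", "'family planning'",
                 "'risky sexual behaviour*'", "'unplanned pregnancy'"]
factor_terms = ["barrier*", "obstacle*", "knowledge", "attitude*",
                "determinant*", "'decision making'", "'unmet need'"]
location_terms = ["Cambodia*"]

def or_block(terms):
    # join alternatives with OR and parenthesize the block
    return "( " + " OR ".join(terms) + " )"

query = " AND ".join(or_block(t) for t in
                     (outcome_terms, factor_terms, location_terms))
print(query)
```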

Keywords: adolescents, high-risk sex, sexual activity, unplanned pregnancies

Procedia PDF Downloads 244
63 Impact of Increased Radiology Staffing on After-Hours Radiology Reporting Efficiency and Quality

Authors: Peregrine James Dalziel, Philip Vu Tran

Abstract:

Objective / Introduction: Demand for radiology services from Emergency Departments (ED) continues to increase, with greater demands placed on radiology staff providing reports for the management of complex cases. Queuing theory indicates that wide variability in process time, combined with the random nature of request arrivals, increases the probability of significant queues. This can delay the time-to-availability of radiology reports (TTA-RR) and potentially impair ED patient flow. In addition, the greater cognitive workload of higher volumes may reduce productivity and increase errors. We sought to quantify the potential ED flow improvements obtainable from additional radiology providers serving 3 public hospitals in Melbourne, Australia, and to assess the potential productivity gains, quality improvement and cost-effectiveness of the increased labor input. Methods & Materials: The Western Health Medical Imaging Department moved from single-resident coverage on weekend days (8:30 am-10:30 pm) to a limited period of two-resident coverage (1 pm-6 pm) on both weekend days. TTA-RR for weekend CT scans was calculated from the PACS database for the eight-month period centred on the date of the staffing change. A multivariate linear regression model was developed to isolate the improvement in TTA-RR between the two four-month periods. Daily and hourly scan volumes at the time of each CT scan were calculated to assess the impact of varying department workload. To assess any improvement in report quality, a random sample of 200 studies was assessed to compare the average number of clinically significant over-read addendums between the two periods. Cost-effectiveness was assessed by comparing the marginal cost of additional staffing against a conservative estimate of the economic benefit of improved ED patient throughput, using the Australian national insurance rebate for private ED attendance as a revenue proxy. Results: The primary resident on call and the type of scan accounted for most of the explained variability in time to report availability (R2=0.29). Increasing daily and hourly volumes were associated with increased TTA-RR (1.5 minutes (p<0.01) and 4.8 minutes (p<0.01), respectively, per additional scan ordered within each time frame). Reports were available 25.9 minutes sooner on average in the 4 months post-implementation of double coverage (p<0.01), with an additional 23.6-minute improvement when two residents were on-site concomitantly (p<0.01). The aggregate average improvement in TTA-RR was 24.8 hours per weekend day. This represents increased decision-making time available to ED physicians and a potential improvement in ED bed utilisation. 5% of reports from the intervention period contained clinically significant addendums vs 7% in the single-resident period, but this difference was not statistically significant (p=0.7). The marginal cost was less than the anticipated economic benefit, assuming a 50% capture of the improved TTA-RR in patient disposition and using the lowest available national insurance rebate as a proxy for economic benefit. Conclusion: TTA-RR improved significantly during the period of increased staff availability, both during the specific period of increased staffing and throughout the day. The increased labor utilisation is cost-effective given the potential productivity improvement for ED cases requiring CT imaging.
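
The abstract does not give the exact regression specification. The sketch below is a hypothetical illustration of such a model in Python, using synthetic data generated with the reported effect sizes as ground truth; all column names are assumptions, not the authors' variables.

```python
# A minimal sketch (hypothetical variables, synthetic data): multivariate
# linear regression of report time-to-availability (TTA-RR, minutes) on
# staffing period, concurrent double coverage, workload, resident, and scan
# type, in the spirit of the analysis described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "post_period": rng.integers(0, 2, n),    # after the staffing increase
    "double_onsite": rng.integers(0, 2, n),  # two residents on-site
    "daily_volume": rng.poisson(60, n),
    "hourly_volume": rng.poisson(6, n),
    "resident": rng.choice(list("ABCD"), n),
    "scan_type": rng.choice(["head", "abdo", "chest"], n),
})
# synthetic ground truth uses the effect sizes reported above
df["tta_rr_min"] = (90 - 25.9 * df.post_period - 23.6 * df.double_onsite
                    + 1.5 * df.daily_volume + 4.8 * df.hourly_volume
                    + rng.normal(0, 20, n))

model = smf.ols("tta_rr_min ~ post_period + double_onsite + daily_volume"
                " + hourly_volume + C(resident) + C(scan_type)", data=df).fit()
print(model.params[["post_period", "double_onsite"]])
```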

Keywords: workflow, quality, administration, CT, staffing

Procedia PDF Downloads 112
62 Shared Versus Pooled Automated Vehicles: Exploring Behavioral Intentions Towards On-Demand Automated Vehicles

Authors: Samira Hamiditehrani

Abstract:

Automated vehicles (AVs) are emerging technologies that could offer a wide range of opportunities and challenges for the transportation sector. The advent of AV technology has also produced new business models in shared mobility services, with many ride-hailing and car-sharing companies developing on-demand AVs, including shared automated vehicles (SAVs) and pooled automated vehicles (Pooled AVs). SAVs and Pooled AVs could provide alternative shared mobility services that encourage sustainable transport systems, mitigate traffic congestion, and reduce automobile dependency. However, the success of on-demand AVs in addressing major transportation policy issues depends on whether and how the public adopts them as regular travel modes. To identify conditions under which individuals may adopt on-demand AVs, previous studies have applied human behavior and technology acceptance theories, among which the Theory of Planned Behavior (TPB) is one of the most tested and validated in on-demand AV research. In this respect, this study has three objectives: (a) to propose and validate a theoretical model of behavioral intention to use SAVs and Pooled AVs by extending the original TPB model; (b) to identify the characteristics of early adopters of SAVs, who prefer a shorter, private ride, versus prospective users of Pooled AVs, who choose more affordable but longer, shared trips; and (c) to investigate Canadians’ intentions to adopt on-demand AVs for regular trips. Toward this end, this study uses data from an online survey (n = 3,622) of workers and adult students (18 to 75 years old) conducted in October and November 2021 in six major Canadian metropolitan areas: Toronto, Vancouver, Ottawa, Montreal, Calgary, and Hamilton. To accomplish the goals of this study, a base bivariate ordered probit model, in which SAV and Pooled AV adoption are estimated as ordered dependent variables, and a full structural equation modeling (SEM) system are estimated. The findings indicate that affective motivations such as attitude toward AV technology, perceived privacy, and subjective norms matter more than sociodemographic and travel behavior characteristics in adopting on-demand AVs. The results for the second objective provide evidence that although a few affective motivations, such as subjective norms and having ample knowledge, are common to early adopters of SAVs and Pooled AVs, many of the examined motivations differ between SAV and Pooled AV adoption; in other words, the motivations influencing intention to use on-demand AVs differ by service type. Likewise, the sociodemographic characteristics of early adopters differ significantly depending on the type of on-demand AV. In general, the findings paint a complex picture with respect to the application of constructs from common technology adoption models to the study of on-demand AVs. Findings for the final objective suggest that policymakers, planners, the vehicle and technology industries, and the public at large should moderate their expectations that on-demand AVs will suddenly transform the entire transportation sector. Instead, this study suggests that SAVs and Pooled AVs, when they enter the Canadian market, are likely to be adopted as supplementary mobility tools rather than substitutes for current travel modes.
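
As an illustration of the ordered-probit component, the sketch below fits a univariate ordered probit of SAV intention on synthetic data with hypothetical predictors. The study's actual model estimates SAV and Pooled AV intentions jointly as a bivariate ordered probit, which requires specialized code; statsmodels only fits the univariate case.

```python
# A minimal sketch (synthetic data, hypothetical variables): an ordered
# probit of stated intention to use SAVs, one half of the bivariate model
# the study estimates jointly.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "attitude_av": rng.normal(0, 1, n),   # attitude toward AV technology
    "privacy": rng.normal(0, 1, n),       # perceived privacy
    "subj_norms": rng.normal(0, 1, n),    # subjective norms
    "age": rng.integers(18, 76, n),
})
latent = (0.8 * df.attitude_av + 0.5 * df.privacy
          + 0.6 * df.subj_norms + rng.normal(0, 1, n))
# 4-point ordered intention scale derived from the latent propensity
df["sav_intent"] = pd.cut(latent, [-np.inf, -1, 0, 1, np.inf], labels=False)

res = OrderedModel(df["sav_intent"],
                   df[["attitude_av", "privacy", "subj_norms", "age"]],
                   distr="probit").fit(method="bfgs", disp=False)
print(res.summary())
```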

Keywords: automated vehicles, Canadian perception, theory of planned behavior, on-demand AVs

Procedia PDF Downloads 72
61 The Negative Effects of Controlled Motivation on Mathematics Achievement

Authors: John E. Boberg, Steven J. Bourgeois

Abstract:

The decline in student engagement and motivation through the middle years is well documented and clearly associated with a decline in mathematics achievement that persists through high school. To combat this trend and, very often, to meet high-stakes accountability standards, a growing number of parents, teachers, and schools have implemented various methods to incentivize learning. However, according to Self-Determination Theory, forms of incentivized learning such as public praise, tangible rewards, or threats of punishment tend to undermine intrinsic motivation and learning. By focusing on external forms of motivation that thwart autonomy in children, adults also potentially threaten relatedness measures such as trust and emotional engagement. Furthermore, these controlling motivational techniques tend to promote shallow forms of cognitive engagement at the expense of more effective deep-processing strategies. Any short-term gains in apparent engagement or test scores are therefore overshadowed by long-term diminished motivation, resulting in inauthentic approaches to learning and lower achievement. The current study focuses on the relationships between student trust, engagement, and motivation during the crucial years when students transition from elementary to middle school. To test the effects of controlled motivational techniques on achievement in mathematics, this quantitative study was conducted on a convenience sample of 22 elementary and middle schools from a single public charter school district in the south-central United States. The study employed multi-source data from students (N = 1,054), parents (N = 7,166), and teachers (N = 356), along with student achievement data and contextual campus variables. Cross-sectional questionnaires were used to measure the students’ self-regulated learning, emotional and cognitive engagement, and trust in teachers. Parents responded to a single item on incentivizing the academic performance of their child, and teachers responded to a series of questions about their acceptance of various incentive strategies. Structural equation modeling (SEM) was used to evaluate model fit and analyze the direct and indirect effects of the predictor variables on achievement. Although a student’s trust in teachers positively predicted both emotional and cognitive engagement, none of these three predictors accounted for any variance in achievement in mathematics. The parents’ use of incentives, on the other hand, predicted a student’s perception of his or her controlled motivation, and these two variables had significant negative effects on achievement. While controlled motivation had the greatest effects on achievement, parental incentives demonstrated both direct and indirect effects on achievement through the students’ self-reported controlled motivation. Comparing upper-elementary student data with middle-school student data revealed that controlling forms of motivation may be taking their toll on student trust and engagement over time. While parental incentives positively predicted both cognitive and emotional engagement in the younger sub-group, such forms of controlling motivation negatively predicted both trust in teachers and emotional engagement in the middle-school sub-group. These findings support the claims, posited by Self-Determination Theory, about the dangers of incentivizing learning: short-term gains belie the underlying damage to motivational processes that leads to decreased intrinsic motivation and achievement. Such practices also appear to thwart basic human needs such as relatedness.
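
A hypothetical sketch of how such an SEM might be specified is shown below, using the semopy package on synthetic data. The variable names and path structure are assumptions for illustration, not the authors' instrument or model.

```python
# A minimal sketch (synthetic data, assumed variable names): an SEM in the
# spirit of the study, with latent trust and engagement, parental
# incentives, controlled motivation, and achievement.
import numpy as np
import pandas as pd
from semopy import Model

rng = np.random.default_rng(2)
n = 800
trust = rng.normal(0, 1, n)
incent = rng.integers(0, 2, n)              # parental incentive use
ctrl = 0.6 * incent + rng.normal(0, 1, n)   # controlled motivation
engage = 0.5 * trust + rng.normal(0, 1, n)
df = pd.DataFrame({
    "trust1": trust + rng.normal(0, 0.5, n),
    "trust2": trust + rng.normal(0, 0.5, n),
    "emo1": engage + rng.normal(0, 0.5, n),
    "cog1": engage + rng.normal(0, 0.5, n),
    "parent_incentives": incent,
    "controlled_motivation": ctrl,
    "achievement": -0.4 * ctrl - 0.2 * incent + rng.normal(0, 1, n),
})

desc = """
trust =~ trust1 + trust2
engagement =~ emo1 + cog1
engagement ~ trust
controlled_motivation ~ parent_incentives
achievement ~ engagement + controlled_motivation + parent_incentives
"""
model = Model(desc)
model.fit(df)
print(model.inspect())   # path estimates, incl. direct and indirect effects
```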

Keywords: controlled motivation, student engagement, incentivized learning, mathematics achievement, self-determination theory, student trust

Procedia PDF Downloads 218
60 Theoretical Modelling of Molecular Mechanisms in Stimuli-Responsive Polymers

Authors: Catherine Vasnetsov, Victor Vasnetsov

Abstract:

Context: Thermo-responsive polymers are materials that undergo significant changes in their physical properties in response to temperature changes. These polymers have gained significant attention in research due to their potential applications in various industries and in medicine. However, the molecular mechanisms underlying their behavior are not well understood, particularly in relation to cosolvency, which is crucial for practical applications. Research Aim: This study aimed to theoretically investigate the phenomenon of cosolvency in long-chain polymers using the Flory-Huggins statistical-mechanical framework. The main objective was to understand the interactions between the polymer, solvent, and cosolvent under different conditions. Methodology: The research employed a combination of Monte Carlo computer simulations and machine-learning methods, with the Flory-Huggins mean-field theory as the basis for the simulations. Spinodal graphs and ternary plots were used to develop an initial computer model for predicting polymer behavior. Molecular dynamics simulations were conducted to mimic real-life polymer systems, and machine learning techniques were incorporated to enhance the accuracy and reliability of the simulations. Findings: The simulations revealed that the addition of very low or very high volumes of cosolvent molecules resulted in smaller radii of gyration for the polymer, indicating poor miscibility. However, intermediate volume fractions of cosolvent led to higher radii of gyration, suggesting improved miscibility. These findings provide a possible microscopic explanation for the cosolvency phenomenon in polymer systems. Theoretical Importance: This research contributes to a better understanding of the behavior of thermo-responsive polymers and the role of cosolvency. The findings provide insights into the molecular mechanisms underlying cosolvency and offer specific predictions for future experimental investigations. The study also presents a more rigorous analysis of the Flory-Huggins free energy theory in the context of polymer systems. Data Collection and Analysis Procedures: The data for this study were collected through Monte Carlo and molecular dynamics simulations. The interactions between the polymer, solvent, and cosolvent were analyzed using the Flory-Huggins mean-field theory, with machine learning employed to enhance the accuracy of the simulations. The collected data were then analyzed to determine the impact of cosolvent volume fraction on the radius of gyration of the polymer. Question Addressed: The research addressed the question of how cosolvency affects the behavior of long-chain polymers. Specifically, the study aimed to investigate the interactions between the polymer, solvent, and cosolvent at different volume fractions and to understand the resulting changes in the radius of gyration. Conclusion: This study used theoretical modeling and computer simulations to investigate the phenomenon of cosolvency in long-chain polymers. The findings suggest that moderate cosolvent volume fractions can lead to improved miscibility, as indicated by larger radii of gyration. These insights contribute to a better understanding of the molecular mechanisms underlying cosolvency in polymer systems and provide predictions for future experimental studies, while also strengthening the theoretical analysis of the Flory-Huggins free energy theory.
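
For reference, the Flory-Huggins mean-field free energy underlying such simulations, f(phi)/kT = (phi/N) ln phi + (1 - phi) ln(1 - phi) + chi phi (1 - phi) for a binary polymer-solvent mixture, can be probed numerically in a few lines. The sketch below computes the spinodal boundary, d2f/dphi2 = 0, for a long chain; parameter values are illustrative, and the study's full ternary (polymer-solvent-cosolvent) case adds a second composition variable.

```python
# A minimal sketch: Flory-Huggins free energy of mixing per lattice site
# (units of kT) for chain length N, and the spinodal chi_s(phi) from the
# condition d2f/dphi2 = 0.
import numpy as np

def f_mix(phi, N, chi):
    return phi / N * np.log(phi) + (1 - phi) * np.log(1 - phi) + chi * phi * (1 - phi)

def chi_spinodal(phi, N):
    # d2f/dphi2 = 1/(N*phi) + 1/(1-phi) - 2*chi = 0
    return 0.5 * (1.0 / (N * phi) + 1.0 / (1.0 - phi))

phi = np.linspace(1e-3, 0.999, 500)
N = 1000                                   # long chain
print("critical chi ~", chi_spinodal(phi, N).min())  # -> ~0.53, near 1/2 for large N
```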

Keywords: molecular modelling, flory-huggins, cosolvency, stimuli-responsive polymers

Procedia PDF Downloads 68
59 Empowering and Educating Young People Against Cybercrime by Playing: The Rayuela Method

Authors: Jose L. Diego, Antonio Berlanga, Gregorio López, Diana López

Abstract:

The Rayuela method is a success story: it is part of a project selected by the European Commission under the Commission's own challenge to achieve a better understanding of the human factors, as well as the social and organisational aspects, involved in fighting crime. Rayuela's method specifically focuses on the drivers of cyber criminality, including approaches to prevent, investigate, and mitigate cybercriminal behavior. As the internet has become an integral part of young people's lives, they are the key target of the Rayuela method, because (whether as victim or as perpetrator) they are the most vulnerable link of the chain. Considering the increased time young people spend online, the limited oversight of their internet usage, and their low awareness of cyber threats and their potential impact, the proliferation of incidents due to human mistakes is understandable. 51% of Europeans feel not well informed about cyber threats, and 86% believe that the risk of becoming a victim of cybercrime is rapidly increasing. On the other hand, law enforcement has noted that more and more young people are committing cybercrimes. This is an international problem with considerable cost implications; it is estimated that crimes in cyberspace will cost the global economy $445B annually. Understanding these phenomena drives the necessity of a shift in focus from sanctions to deterrence and prevention. As a research project, Rayuela aims to bring together law enforcement agencies (LEAs), sociologists, psychologists, anthropologists, legal experts, computer scientists, and engineers to develop novel methodologies that allow a better understanding of the factors affecting online behavior related to new forms of cyber criminality, as well as promoting the potential of young talents for cybersecurity and technologies. Rayuela's main goal is to better understand the drivers and human factors affecting certain relevant forms of cyber criminality, and to empower and educate young people in the benefits, risks, and threats intrinsically linked to the use of the Internet by playing, thus preventing and mitigating cybercriminal behavior. To reach that goal, an interdisciplinary consortium of 17 international partners carries out research and actions such as profiling and case studies of cybercriminals and victims, risk assessments, studies of the Internet of Things and its vulnerabilities, development of a serious-gaming environment, training activities, data analysis and interpretation using artificial intelligence, and testing and piloting. To facilitate real-world implementation of the Rayuela method as a community policing strategy, it is crucial to count on a police force with a solid background in trust-building and community policing to carry out the piloting, specifically with young people. In this sense, Valencia Local Police is a pioneering police force in working with young people on conflict resolution, providing police mediation and peer mediation services and advice. It is an official mediation institution, so agreements signed by its police mediators, once signed by the parties, carry the value of a judicial decision.

Keywords: fight against crime and insecurity, avert and prepare young people against aggression, ICT, serious gaming and artificial intelligence against cybercrime, conflict solving and mediation with young people

Procedia PDF Downloads 127
58 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo

Authors: Margaret Boone Rappaport, Christopher J. Corbally

Abstract:

The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes, and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics. And, it emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. (2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but, in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors’ published model of morality's emergence in Homo erectus encompasses a cognitively based, decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50-60,000 years ago, when human ancestors left Africa, they were fully enabled.

Keywords: genetic drift, genomics, parietal expansion, religious capacity

Procedia PDF Downloads 341
57 Optimizing Machine Learning Algorithms for Defect Characterization and Elimination in Liquids Manufacturing

Authors: Tolulope Aremu

Abstract:

Key process steps in producing liquid detergents, such as formulation, mixing, filling, and packaging, can introduce defects that compromise product quality, consumer safety, and operational efficiency. Real-time identification and characterization of such defects are of prime importance for maintaining high standards and reducing waste and costs. Defect detection is usually performed by human inspection or rule-based systems, which are time-consuming, inconsistent, and error-prone. The present study addresses these limitations by optimizing machine learning algorithms for defect characterization in liquid detergent manufacturing. Several machine learning models (Support Vector Machines, Decision Trees, Random Forests, and Convolutional Neural Networks) were tested on detecting and classifying defects such as incorrect viscosity, color deviation, improper bottle filling, and packaging anomalies. These algorithms benefited significantly from a variety of optimization techniques, including hyperparameter tuning and ensemble learning, which improved detection accuracy while minimizing false positives. The study draws on a rich dataset of defect types and production parameters consisting of more than 100,000 samples, including real-time sensor data, imaging technologies, and historical production records. The results show that optimized machine learning models significantly improve defect detection compared to traditional methods. For instance, CNNs fine-tuned with real-time imaging data achieved 98% and 96% accuracy in detecting packaging anomalies and bottle-filling inconsistencies, respectively, with a reduction in false positives of about 30%. The optimized SVM model detected formulation defects, namely viscosity and color variation, with 94% accuracy. These performance metrics represent a large improvement in defect detection accuracy over the roughly 80% level achieved to date by rule-based systems. Moreover, with real-time data processing, the optimized models hasten defect characterization, bringing detection time below 15 seconds from an average of 3 minutes with manual inspection. This time reduction, combined with a 25% reduction in production downtime through proactive defect identification, can save millions annually in recall and rework costs. Integrating real-time machine-learning-driven monitoring also drives predictive maintenance and corrective measures, for a 20% improvement in overall production efficiency. Optimizing machine learning algorithms for defect characterization therefore gives liquid detergent companies scalability, efficiency, and improved operational performance at higher levels of product quality. More generally, this method could be applied across the fast-moving consumer goods industry, improving quality control processes.
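
A minimal sketch of the kind of hyperparameter tuning described, run on synthetic stand-in data rather than the production dataset, might look as follows; the grids and feature dimensions are assumptions for illustration.

```python
# A minimal sketch (synthetic stand-in data, not the production dataset):
# hyperparameter tuning of an SVM and a Random Forest for multi-class
# defect classification, in the spirit of the pipeline described above.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# synthetic stand-in for sensor/imaging features and 4 defect classes
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

svm = Pipeline([("scale", StandardScaler()), ("clf", SVC())])
searches = {
    "SVM": GridSearchCV(svm, {"clf__C": [0.1, 1, 10],
                              "clf__gamma": ["scale", 0.01]}, cv=5),
    "RandomForest": GridSearchCV(RandomForestClassifier(random_state=0),
                                 {"n_estimators": [200, 500],
                                  "max_depth": [None, 20]}, cv=5),
}
for name, search in searches.items():
    search.fit(X_tr, y_tr)
    print(name, search.best_params_, "held-out accuracy:",
          round(search.score(X_te, y_te), 3))
```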

Keywords: liquid detergent manufacturing, defect detection, machine learning, support vector machines, convolutional neural networks, defect characterization, predictive maintenance, quality control, fast-moving consumer goods

Procedia PDF Downloads 16
56 Numerical Modeling of Phase Change Materials Walls under Reunion Island's Tropical Weather

Authors: Lionel Trovalet, Lisa Liu, Dimitri Bigot, Nadia Hammami, Jean-Pierre Habas, Bruno Malet-Damour

Abstract:

The MCP-iBAT1 project studies the behavior of Phase Change Materials (PCM) integrated into building envelopes in a tropical environment. Through the phase transitions (melting and freezing) of the material, thermal energy can be absorbed or released. This process enables the regulation of indoor temperatures and the improvement of thermal comfort for the occupants. Most commercially available PCMs are more suitable for temperate climates than for tropical climates. The case of Reunion Island is noteworthy, as it has multiple micro-climates. This leads to our key question: developing one or multiple bio-based PCMs that cover the thermal needs of the island's different locations. The present paper focuses on the numerical approach to selecting PCM properties relevant to tropical areas. Numerical simulations have been carried out with two software tools: EnergyPlus and Isolab. The latter has been developed in the laboratory, with the implicit finite difference method, in order to evaluate different physical models. Both are thermal dynamic simulation (TDS) tools that predict a building's thermal behavior with one-dimensional heat transfers. The parameters used in this study are the construction's characteristics (dimensions and materials) and the description of the environment (meteorological data and building surroundings). The building is modeled in accordance with the experimental setup. It is divided into two rooms, cells A and B, with the same dimensions. Cell A is the reference, while in cell B a layer of commercial PCM (Thermo Confort from MCI Technologies) has been applied to the inner surface of the north wall. Sensors are installed in each room to record temperatures, heat flows, and humidity rates, and the collected data are used for comparison with the numerical results. Our strategy is to instrument two similar buildings at different altitudes (Saint-Pierre: 70 m and Le Tampon: 520 m) to measure different temperature ranges; we are therefore able to collect data for various seasons during a condensed time period. The following methodology is used to validate the numerical models: calibration of the thermal and PCM models in EnergyPlus and Isolab based on experimental measurements, then numerical testing with a sensitivity analysis of the parameters to reach the targeted indoor temperatures. The calibration relies on the past ten months of measurements (September 2020 to June 2021), with a focus on a one-week study in November (the beginning of summer), when the effect of PCM on inner surface temperatures is more visible. A first simulation with the PCM model of EnergyPlus gave results approaching the measurements, with a mean error of 5%. The property studied in this paper is the melting temperature of the PCM. By determining the representative temperatures of winter, summer and the inter-seasons from past annual weather data, it is possible to build a numerical model of multi-layered PCM; the combined properties of the materials will then provide an optimal scenario for the application of PCM in tropical areas. Future work will focus on the development of bio-based PCMs with the selected properties, followed by experimental and numerical validation of the materials. 1MCP-iBAT: Matériaux à Changement de Phase, une innovation pour le Bâti Tropical (Phase Change Materials, an innovation for tropical buildings).
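
To illustrate the finite-difference approach, the sketch below solves 1D conduction through a PCM layer using the effective-heat-capacity method, in which the latent heat is folded into the specific heat over the melting range. All material values are assumptions rather than the Thermo Confort product data, and the scheme here is explicit for brevity, whereas Isolab uses an implicit scheme.

```python
# A minimal 1D finite-difference sketch (explicit scheme, illustrative
# material values): conduction through a PCM layer with latent heat spread
# over the melting range via an effective heat capacity.
import numpy as np

L, n = 0.02, 41                    # layer thickness [m], grid points
dx = L / (n - 1)
k, rho = 0.2, 900.0                # conductivity [W/m.K], density [kg/m3]
cp_s, Lf = 2000.0, 150e3           # sensible c_p [J/kg.K], latent heat [J/kg]
Tm, dT = 25.0, 2.0                 # melt temperature and half-range [degC]

def cp_eff(T):
    # spread latent heat uniformly over the melting range [Tm-dT, Tm+dT]
    return cp_s + np.where(np.abs(T - Tm) < dT, Lf / (2 * dT), 0.0)

T = np.full(n, 22.0)               # initial wall temperature
dt = 0.2 * rho * cp_s * dx**2 / k  # stable explicit time step
for step in range(20000):
    T[0], T[-1] = 32.0, 24.0       # imposed surface temperatures (Dirichlet)
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * k / (rho * cp_eff(T[1:-1])) * lap
print("mid-layer temperature:", round(T[n // 2], 2), "degC")
```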

Keywords: energyplus, multi-layer of PCM, phase changing materials, tropical area

Procedia PDF Downloads 93
55 Sustainable Housing and Urban Development: A Study on the Soon-To-Be-Old Population's Impetus to Migrate

Authors: Tristance Kee

Abstract:

With the unprecedented increase in the elderly population globally, it is critical to search for new sustainable housing and urban development alternatives to traditional housing options. This research examines elderly migration patterns from the high-density city of Hong Kong to Mainland China. The research objectives are to: 1) explore the relationships between the soon-to-be-old's intentions to move to the Mainland upon retirement and their demographic characteristics; and 2) identify the amenities, locational factors and activities desired in the soon-to-be-old generation's retirement housing environment. Primary data were collected through a questionnaire survey conducted using a random sampling method with respondents aged between 45 and 64 years old. The face-to-face survey was completed by 500 respondents and was divided into four sections. The first section covered the respondent's demographic information, such as gender, age, educational attainment, monthly income, housing tenure type and visits to Mainland China. The second section covered retirement plans in terms of intended retirement age, prospective retirement funding and retirement housing options. The third section covered the respondent's attitudes toward retiring to the Mainland, asking about intentions to migrate to the Mainland and incentives to retire in Hong Kong. The fourth section covered the respondent's ideal housing environment, including preferred housing amenities, desired living environment and retirement activities. The dependent variable in this study was 'respondent's consideration to move to Mainland China upon retirement'. Eight primary independent variables were integrated into the study to identify their correlations with the retirement migration plan: gender, age, marital status, monthly income, present housing tenure type, property ownership in Hong Kong, relationship with the Mainland and the frequency of visiting Mainland China. In addition, respondents were asked to indicate their retirement plans (retirement age, funding sources and retirement housing options), incentives to migrate for retirement (choices included property ownership, family relations, cost of living, living environment, medical facilities, government welfare benefits, etc.), and perceived ideal retirement life qualities, including desired amenities (sports, medical and leisure facilities, etc.), desired locational qualities (green open space, convenient transport options and accessibility to urban settings, etc.) and desired retirement activities (home-based leisure, elderly-friendly sports, cultural activities, child care, social activities, etc.). The findings show correlations between the independent variables and consideration of moving to the Mainland; the two independent variables indicating a possible correlation were gender and the current frequency of visiting the Mainland. Given increasing property prices across the border and strong social relationships, potential retirement migration is a very subjective decision that varies from person to person. This research adds knowledge to housing research and migration study. Although the migration destination studied is the Mainland, most of the characteristics identified, including better medical services, government welfare and sound urban amenities, are shared qualities for all sustainable urban development and housing strategies.

Keywords: elderly migration, housing alternative, soon-to-be-old, sustainable environment

Procedia PDF Downloads 209
54 Sustainability Framework for Water Management in New Zealand's Canterbury Region

Authors: Bryan Jenkins

Abstract:

Introduction: The expansion of irrigation in the Canterbury region has led to sustainability limits being reached for water availability and for the cumulative effects of land use intensification. The institutional framework under New Zealand's Resource Management Act was found to be an inadequate basis for managing water at sustainability limits. An alternative paradigm for water management was developed based on collaborative governance and nested adaptive systems, which led to the formulation and implementation of the Canterbury Water Management Strategy. Methods: The nested adaptive system approach was adopted. Sustainability issues were identified at multiple spatial and time scales, and potential failure pathways for the water resource system were defined. These included biophysical and socio-economic issues such as water availability; cumulative effects on water quality due to land use intensification; projected changes in climate; public health; institutional arrangements; economic outcomes and externalities; and the social effects of changing technology. This led to the derivation of sustainability strategies to address these failure pathways. The collaborative governance approach involved stakeholder participation and community engagement to decide on a regional strategy; regional and zone committees of community and rūnanga (Māori group) members to develop implementation programmes for the strategy; and farmer collectives for operational management. Findings: The strategy identified that improving the efficiency of use of water already allocated was more effective in improving water availability than reliance on increased storage alone. New forms of storage with fewer adverse impacts were introduced, such as managed aquifer recharge and off-river storage. Reducing nutrients from land use intensification by improving management practices has been a priority. Solution packages for addressing the degradation of vulnerable lakes and rivers have been prepared, and biodiversity enhancement projects have been initiated. Greater involvement of Māori has led to the incorporation of kaitiakitanga (resource stewardship) into implementation programmes. Emerging issues are the need for improved integration of surface water and groundwater interactions, increased use of modeling of water and financial outcomes to guide decision making, and equity in allocation among existing users as well as between existing and future users. Conclusions: Sustainability analysis indicates that the proposed levels of management intervention are not sufficient to achieve community targets for water management. More proactive recovery and rehabilitation measures are needed: managing to environmental limits is not sufficient; managing adaptive cycles is needed. Better measurement and management of water-use efficiency are required. The proposed implementation packages are not sufficient to deliver the desired water quality outcomes, and greater attention to targets important to environmental and recreational interests is needed to maintain trust in the collaborative process. Implementation programmes do not adequately address climate change adaptation and greenhouse gas mitigation. Affordability is a constraint on the adaptive capacity of farmers and communities, and more funding mechanisms are required to implement proactive measures. The legislative and institutional framework needs to be changed to incorporate water framework legislation, regional sustainability strategies and water infrastructure coordination.

Keywords: collaborative governance, irrigation management, nested adaptive systems, sustainable water management

Procedia PDF Downloads 158
53 Implementation of Autologous Adipose Graft from the Abdomen for Complete Fat Pad Loss of the Heel Following a Traumatic Open Fracture Secondary to a Motor Vehicle Accident: A Case Study

Authors: Ahmad Saad, Shuja Abbas, Breanna Marine

Abstract:

Introduction: This study explores the potential of autologous pedal fat pad grafting as a minimally invasive therapeutic strategy for addressing pedal fat pad loss. Without adequate shock-absorbing tissue, a patient can experience functional deficits, ulcerations, loss of quality of life, and significant limitations in ambulation. This study details a novel technique involving autologous adipose grafting from the abdomen to enhance plantar fat pad thickness in a patient involved in a severe motor vehicle accident that resulted in total fat pad loss of the heel. Autologous adipose grafting (AAG) was used following adipose allografting in an effort to recreate a normal shock-absorbing surface and allow a return to activities of daily living and painless ambulation. Methods: A 46-year-old male sustained multiple open pedal fractures and necrosis of the heel fat pad after a motorcycle accident, resulting in complete loss of the calcaneal fat pad. The patient underwent serial debridements, wound VAC therapy and split-thickness skin grafting to accomplish complete closure, despite complete loss of adipose tissue in the area. The patient presented with pain on ambulation, inability to bear weight on the heel, and recurrent ulcerations, and reported not having ambulated for two years. Clinical examination demonstrated complete loss of the plantar fat pad, with a thin layer of epithelial tissue overlying the calcaneal bone that allowed visibility of the osseous contour of the calcaneus. Scar tissue had formed in place of the fat pad, with thickened epithelial tissue extending from the midfoot to the calcaneus. After conservative measures were exhausted, the patient opted for initial management with adipose allograft matrix (AAM) injections. Postoperative X-ray imaging revealed noticeable improvement in calcaneal fat pad thickness. At the one-year follow-up, the patient was able to ambulate without assistive devices. The fat pad at this point was significantly thicker than preoperatively, but its thickness had not returned to the pre-accident level. To compare graft take between adipose allograft and autograft, autografting via abdominal liposuction harvest was deemed suitable. A general surgeon harvested adipose cells from the patient's abdomen via liposuction, and a podiatric surgeon performed the AAG injection into the heel; a total of 15 cc of autologous adipose tissue was injected into the calcaneus. Results: There was a visible increase in calcaneal fat pad thickness both clinically and radiographically. At the 6-week follow-up, imaging revealed retention of the calcaneal fat pad thickness. Three months postoperatively, the patient had returned to activities of daily living with increased quality of life due to improved ability to ambulate. Discussion: AAG is a novel treatment for pedal fat pad loss. These treatments may be viable and reproducible therapeutic choices for patients suffering from fat pad atrophy, fat pad loss, and/or plantar ulcerations. Both AAM and AAG exhibited similar therapeutic results, providing pain relief on ambulation and allowing patients to return to their quality of life.

Keywords: podiatry, wound, adipose, allograft, autograft, wound care, limb reconstruction, injection, limb salvage

Procedia PDF Downloads 81
52 Analytical Model of Locomotion of a Thin-Film Piezoelectric 2D Soft Robot Including Gravity Effects

Authors: Zhiwu Zheng, Prakhar Kumar, Sigurd Wagner, Naveen Verma, James C. Sturm

Abstract:

Soft robots have drawn great interest recently due to the rich range of shapes and motions they can take on to address new applications, compared to traditional rigid robots. Large-area electronics (LAE) provides a unique platform for creating soft robots by leveraging thin-film technology to integrate a large number of actuators, sensors, and control circuits on flexible sheets. However, the rich shapes and motions possible, especially when interacting with complex environments, pose significant challenges to forming the well-generalized and robust models necessary for robot design and control. In this work, we describe an analytical model, based on Euler-Bernoulli beam theory, for predicting the shape and locomotion of a flexible (steel-foil-based) piezoelectric-actuated 2D robot. Nominally (unpowered), the robot lies flat on the ground; when powered, its shape is controlled by an array of piezoelectric thin-film actuators. Key features of the model are its ability to incorporate the significant effects of gravity on the shape and to precisely predict the spatial distribution of friction against the contacting surfaces, which is necessary for determining inchworm-type motion. We verified the model by developing a distributed discrete-element representation of a continuous piezoelectric actuator and by comparing the analytical predictions to discrete-element robot simulations using PyBullet. Without gravity, predicting the shape of a sheet with a linear array of piezoelectric actuators at arbitrary voltages is straightforward. However, gravity significantly distorts the shape of the sheet, causing some segments to flatten against the ground. Our work includes the following contributions: (i) A self-consistent approach was developed to determine exactly which parts of the soft robot are lifted off the ground, and the exact shape of these sections, for an arbitrary array of piezoelectric voltages and configurations. (ii) Inchworm-type motion relies on controlling the relative friction with the ground surface in different sections of the robot. By adding torque balance to the model and analyzing shear forces, the model can determine the exact spatial distribution of the vertical force that the ground exerts on the soft robot, and from this, the spatial distribution of friction forces between ground and robot. (iii) By combining this spatial friction distribution with the shape of the soft robot as a function of time while the piezoelectric actuator voltages are changed, the inchworm-type locomotion of the robot can be determined. As a practical example, we calculated the performance of a 5-actuator system on a 50-µm-thick steel foil, assuming the piezoelectric properties of commercially available thin-film piezoelectric actuators. The model predicted inchworm motion of up to 200 µm per step. For independent verification, we also modelled the system using PyBullet, a discrete-element robot simulator. To model a continuous thin-film piezoelectric actuator, we broke each actuator into multiple segments, each consisting of two rigid arms with appropriate mass connected by a 'motor' whose torque was set by the applied actuator voltage. Excellent agreement between the analytical model and the discrete-element simulator was shown for both the full deformation shape and the motion of the robot.
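
As a simplified illustration of the curvature-integration step (without the self-consistent lift-off, gravity, and friction analysis that is the paper's contribution), the sketch below computes a small-deflection Euler-Bernoulli shape for a strip whose piecewise-constant curvature is set by per-segment actuator voltages. All parameter values are assumptions for illustration.

```python
# A minimal sketch (illustrative parameters, no gravity or contact):
# small-deflection shape of a thin strip with a linear array of
# piezoelectric segments. Each powered segment imposes a uniform curvature
# proportional to its voltage; integrating curvature twice gives the shape.
import numpy as np

n_seg, pts = 5, 100                   # actuator segments, points per segment
length = 0.05                         # strip length [m]
x = np.linspace(0.0, length, n_seg * pts)
volts = np.array([50.0, 0.0, 50.0, 0.0, 50.0])  # per-segment voltages (assumed)
kappa_per_volt = 0.2                  # curvature per volt [1/(m.V)] (assumed)

kappa = np.repeat(kappa_per_volt * volts, pts)  # piecewise-constant curvature
dx = x[1] - x[0]
theta = np.cumsum(kappa) * dx         # slope = integral of curvature
z = np.cumsum(theta) * dx             # height = integral of slope
print("max lift-off height: %.2f mm" % (z.max() * 1e3))
```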

Keywords: analytical modeling, piezoelectric actuators, soft robot locomotion, thin-film technology

Procedia PDF Downloads 176
51 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators

Authors: K. O'Malley

Abstract:

Background: The emergence and rapid evolution of large language models (LLMs), i.e., models of generative artificial intelligence (AI), has been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling over vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A structured search strategy was used to interrogate the PubMed electronic database. The keywords ‘ChatGPT’ AND ‘medical education’ OR ‘medical school’ OR ‘medical licensing exam’ were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years and was limited to publications in English. Eligibility was ascertained from the study title and abstract and confirmed by consulting the full-text document. Data were extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent; the mean score across all studies was 82.49 percent (SD = 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents challenges with which educators have not had to contend before. The strong overall performance of ChatGPT may lend itself to unfair use (such as plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the studies above, ChatGPT exhibits a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education, who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to address them effectively and mitigate potential disruption in academic fora.

Keywords: artificial intelligence, ChatGPT, generative ai, large language models, licensing exam, medical education, medicine, university

Procedia PDF Downloads 28