Search results for: psychosocial approach
9472 Natural Convection of a Nanofluid in a Conical Container
Authors: Brahim Mahfoud, Ali Bendjaghlouli
Abstract:
Natural convection is simulated in a truncated cone filled with a nanofluid. The inclined and top walls are held at constant temperature, while the heat source is located on the bottom wall of the conical container, which is thermally insulated. A finite volume approach is used to solve the governing equations with the SIMPLE algorithm for different parameters such as the Rayleigh number, the inclination angle of the inclined walls of the enclosure, and the heat source length. The results showed an enhancement of the cooling system when a nanofluid is used and the conduction regime is assisted. The inclination angle of the inclined sidewall and the heat source length affect the heat transfer rate and the maximum temperature.
Keywords: heat source, truncated cone, nanofluid, natural convection
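For orientation, the Rayleigh number named in this abstract is the standard dimensionless group governing buoyancy-driven convection; the textbook definition below is provided for the reader's convenience and is not quoted from the paper.

```latex
% Standard Rayleigh number for natural convection (textbook definition, not taken from the paper)
\[
  Ra = \frac{g\,\beta\,\Delta T\,L^{3}}{\nu\,\alpha}
\]
% g: gravitational acceleration, beta: thermal expansion coefficient (effective value for a nanofluid),
% Delta T: imposed temperature difference, L: characteristic length of the enclosure,
% nu: kinematic viscosity, alpha: thermal diffusivity
```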
Procedia PDF Downloads 368
9471 An Incremental Refinement Approach to a Development of Dynamic Host Configuration Protocol (DHCP) Using Event-B
Authors: Rajaa Filali, Mohamed Bouhdadi
Abstract:
This paper presents an incremental development of the Dynamic Host Configuration Protocol (DHCP) in Event-B. DHCP is a widely used communication protocol which provides a standard mechanism for obtaining configuration parameters. The specification is performed in a stepwise manner and verified through a series of refinements. The Event-B formal method, supported by the Rodin platform, provides an accessible and rigorous development method and is used to model and verify properties of the protocol such as safety, liveness, and deadlock freedom. This interaction between modelling and proving reduces complexity and helps to eliminate misunderstandings, inconsistencies, and specification gaps.
Keywords: DHCP protocol, Event-B, refinement, proof obligation, Rodin
Procedia PDF Downloads 227
9470 Deep Learning for Image Correction in Sparse-View Computed Tomography
Authors: Shubham Gogri, Lucia Florescu
Abstract:
Medical diagnosis and radiotherapy treatment planning using Computed Tomography (CT) rely on the quantitative accuracy and quality of the CT images. At the same time, requirements for CT imaging include reducing the radiation dose exposure to patients and minimizing scanning time. A solution to this is the sparse-view CT technique, based on a reduced number of projection views. This, however, introduces a new problem: the incomplete projection data results in lower quality of the reconstructed images. To tackle this issue, deep learning methods have been applied to enhance the quality of the sparse-view CT images. A first approach involved employing Mir-Net, a dedicated deep neural network designed for image enhancement. This showed promise, utilizing an intricate architecture comprising encoder and decoder networks, along with the incorporation of the Charbonnier Loss. However, this approach was computationally demanding. Subsequently, a specialized Generative Adversarial Network (GAN) architecture, rooted in the Pix2Pix framework, was implemented. This GAN framework involves a U-Net-based Generator and a Discriminator based on Convolutional Neural Networks. To bolster the GAN's performance, both Charbonnier and Wasserstein loss functions were introduced, collectively focusing on capturing minute details while ensuring training stability. The integration of the perceptual loss, calculated based on feature vectors extracted from the VGG16 network pretrained on the ImageNet dataset, further enhanced the network's ability to synthesize relevant images. A series of comprehensive experiments with clinical CT data were conducted, exploring various GAN loss functions, including Wasserstein, Charbonnier, and perceptual loss. The outcomes demonstrated significant image quality improvements, confirmed through pertinent metrics such as Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM) between the corrected images and the ground truth. Furthermore, learning curves and qualitative comparisons added evidence of the enhanced image quality and the network's increased stability, while preserving pixel value intensity. The experiments underscored the potential of deep learning frameworks in enhancing the visual interpretation of CT scans, achieving outcomes with SSIM values close to one and PSNR values reaching up to 76.
Keywords: generative adversarial networks, sparse view computed tomography, CT image correction, Mir-Net
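To make the loss combination described above concrete, the following PyTorch sketch shows one way a Charbonnier term, a VGG16-based perceptual term, and an adversarial term could be weighted together. The layer cutoff, loss weights, and function names are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from torchvision.models import vgg16, VGG16_Weights

def charbonnier_loss(pred, target, eps=1e-3):
    # Smooth L1-like penalty that is robust to outliers
    return torch.mean(torch.sqrt((pred - target) ** 2 + eps ** 2))

class PerceptualLoss(nn.Module):
    # Feature-space distance using a frozen VGG16 pretrained on ImageNet
    # (assumes 3-channel inputs, e.g. grayscale CT slices replicated across channels)
    def __init__(self, layers=16):
        super().__init__()
        self.features = vgg16(weights=VGG16_Weights.IMAGENET1K_V1).features[:layers].eval()
        for p in self.features.parameters():
            p.requires_grad = False

    def forward(self, pred, target):
        return nn.functional.l1_loss(self.features(pred), self.features(target))

def generator_loss(discriminator, pred, target, perceptual,
                   w_adv=1.0, w_char=100.0, w_perc=10.0):
    # Adversarial term (Wasserstein-style critic score) plus pixel and feature fidelity terms;
    # the weights here are placeholders, not the values used in the study.
    adv = -discriminator(pred).mean()
    return (w_adv * adv
            + w_char * charbonnier_loss(pred, target)
            + w_perc * perceptual(pred, target))
```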
Procedia PDF Downloads 161
9469 Voices of Youth: Contributing to Healthy Teens
Authors: Christa Beyers
Abstract:
Investing in the health of youth is essential for the well-being of society. If youth do not live healthy lives, the future of the global workforce and the overall development of adolescents look bleak, given the challenges posed in this developmental stage. The idea of sexuality education at home and in our schools is a controversial and contentious subject, as many parents and teachers do not hold the same beliefs as to what content should be taught. Despite the high incidence of HIV and STD infections, early school dropout, and teen pregnancies, sexuality education has still not been given the recognition or importance it deserves. Giving youth a voice can lead to both behavioural and policy changes. This article is based on a literature review of sex and sexuality education from a social studies approach. It argues that adults tend to teach from their own perspective, which does not meet the needs of youth, thereby ignoring the social aspects of sexual behaviour.
Keywords: sexuality education, adolescents, communication, cycle of socialization
Procedia PDF Downloads 198
9468 A Framework for Incorporating Non-Linear Degradation of Conductive Adhesive in Environmental Testing
Authors: Kedar Hardikar, Joe Varghese
Abstract:
Conductive adhesives have found wide-ranging applications in the electronics industry, from fixing a defective conductor on a printed circuit board (PCB) and attaching an electronic component in an assembly to protecting electronic components through the formation of a “Faraday cage.” The reliability requirements for the conductive adhesive vary widely depending on the application and expected product lifetime. While the conductive adhesive is required to maintain structural integrity, the electrical performance of the associated sub-assembly can be affected by the degradation of the conductive adhesive. The degradation of the adhesive depends on the highly varied use case. The conventional approach to assessing the reliability of the sub-assembly involves subjecting it to standard environmental test conditions such as high temperature and high humidity, thermal cycling, and high-temperature exposure, to name a few. In order to project test data and observed failures to field performance, systematic development of an acceleration factor between the test conditions and field conditions is crucial. Common acceleration factor models such as the Arrhenius model are based on rate kinetics and typically rely on an assumption of linear degradation in time for a given condition and test duration. The application of interest in this work involves a conductive adhesive used in the electronic circuit of a capacitive sensor. The degradation of the conductive adhesive in a high-temperature, high-humidity environment is quantified by the capacitance values. Under such conditions, the use of established models such as the Hallberg-Peck model or the Eyring model to predict time to failure in the field typically relies on a linear degradation rate. In this particular case, the degradation is nonlinear in time and exhibits a square-root-of-time dependence. It is also shown that for the mechanism of interest, the presence of moisture is essential, and the dominant mechanism driving the degradation is the diffusion of moisture. In this work, a framework is developed to incorporate nonlinear degradation of the conductive adhesive into the development of an acceleration factor. This method can be extended to applications where nonlinearity in the degradation rate can be adequately characterized in tests. It is shown that depending on the expected product lifetime, the use of the conventional linear degradation approach can overestimate or underestimate field performance. This work provides guidelines for the suitability of the linear degradation approximation for such varied applications.
Keywords: conductive adhesives, nonlinear degradation, physics of failure, acceleration factor model
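As a hedged illustration of the framework (the abstract does not state the exact model), a diffusion-limited, square-root-of-time degradation law leads to an acceleration factor that differs from the linear-rate case as follows; the functional form below is an assumption consistent with moisture-diffusion-limited kinetics, not the authors' exact equations.

```latex
% Illustrative sqrt(t) degradation law (assumed form, not the paper's exact model)
\[
  D(t) = k(T, RH)\,\sqrt{t}
  \qquad\Rightarrow\qquad
  t_f = \left(\frac{D_c}{k(T, RH)}\right)^{2}
\]
% D_c: critical degradation level at failure, t_f: time to failure, k: rate constant at a given
% temperature T and relative humidity RH
\[
  AF = \frac{t_{f,\mathrm{field}}}{t_{f,\mathrm{test}}}
     = \left(\frac{k_{\mathrm{test}}}{k_{\mathrm{field}}}\right)^{2}
\]
% With a linear law D(t) = k t, the same ratio enters to the first power, which is why assuming
% linearity can over- or under-estimate field life when the true degradation is nonlinear.
```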
Procedia PDF Downloads 135
9467 Application of the Discrete Rationalized Haar Transform to Distributed Parameter System
Authors: Joon-Hoon Park
Abstract:
In this paper, the rationalized Haar transform is applied to distributed parameter system identification and estimation. A distributed parameter system is a dynamical and mathematical model described by a partial differential equation, and system identification concerns the problem of determining mathematical models from observed data. The Haar function has some computational disadvantages because it contains irrational numbers; for this reason, the rationalized Haar function, which contains only rational numbers, is used. The algorithm adopted in this paper is based on the transform and operational matrix of the rationalized Haar function. This approach provides more convenient and efficient computational results.
Keywords: distributed parameter system, rationalized Haar transform, operational matrix, system identification
Procedia PDF Downloads 509
9466 Investigation on the Acoustical Transmission Path of Additive Printed Metals
Authors: Raphael Rehmet, Prof. Dr.-Ing. Armin Lohrengel
Abstract:
In terms of making machines quieter and more comfortable, it is necessary to analyze the transmission path of mechanical vibrations and structure-borne noise. A typical solution for the elimination of structure-borne noise would be to simply add stiffeners or additional masses to change the transmission behavior and thereby avoid the propagation of vibrations. Another solution could be to use materials with a different damping behavior, such as elastomers, to isolate the machine dynamically. This research investigates the damping behavior of additively printed components made from structural steel or titanium, which have been manufactured in the “Laser Powder Bed Fusion” process. By using the design flexibility that this process offers, it will be investigated how a local impedance difference affects the transmission behavior of the specimens.
Keywords: 3D-printed, acoustics, dynamics, impedance
Procedia PDF Downloads 207
9465 Theoretical Study of the Structural and Elastic Properties of Semiconducting Rare Earth Chalcogenide Sm1-XEuXS under Pressure
Authors: R. Dubey, M. Sarwan, S. Singh
Abstract:
We have investigated the phase transition pressure and associated volume collapse in the Sm1-xEuxS alloy (0 ≤ x ≤ 1), which shows a transition from discontinuous to continuous as x is reduced. The calculated results from the present approach are in good agreement with experimental data available for the end-point members (x = 0 and x = 1). The results for the alloy counterparts are also in fair agreement with experimental data generated from Vegard's law. An improved interaction potential model has been developed which includes Coulomb, three-body interaction, polarizability effect, and overlap repulsive interaction operative up to second-neighbor ions. It is found that the inclusion of the polarizability effect has improved our results.
Keywords: elastic constants, high pressure, phase transition, rare earth compound
Procedia PDF Downloads 419
9464 Emerging Positive Education Interventions for Clean Sport Behavior: A Pilot Study
Authors: Zeinab Zaremohzzabieh, Syasya Firzana Azmi, Haslinda Abdullah, Soh Kim Geok, Aini Azeqa Ma'rof, Hayrol Azril Mohammed Shaffril
Abstract:
The escalating prevalence of doping in sports, casting a shadow over both high-performance and recreational settings, has emerged as a formidable concern, particularly within the realm of young athletes. Doping, characterized by the surreptitious use of prohibited substances to gain a competitive edge, underscores the pressing need for comprehensive and efficacious preventive measures. This study aims to address a crucial void in current research by unraveling the motivations that drive clean adolescent athletes to steadfastly abstain from performance-enhancing substances. In navigating this intricate landscape, the study adopts a positive psychology perspective, investigating the conditions and processes that contribute to the holistic well-being of individuals and communities. At the heart of this exploration lies the application of the PERMA model, a comprehensive positive psychology framework encapsulating positive emotion, engagement, relationships, meaning, and accomplishments. This model functions as a distinctive lens, dissecting intervention results to offer nuanced insights into the complex dynamics of clean sport behavior. The research is poised to usher in a paradigm shift from conventional anti-doping strategies, predominantly fixated on identifying deficits, towards an innovative approach firmly rooted in positive psychology. The objective of this study is to evaluate the efficacy of a positive education intervention program tailored to promote clean sport behavior among Malaysian adolescent athletes. Representing unexplored terrain within the landscape of anti-doping efforts, this initiative endeavors to reshape the focus from deficiencies to strengths. The pilot study engages thirty adolescent athletes, divided into a control group of 15 and an experimental group of 15. It serves to assess the effectiveness of the prepared intervention package, providing indispensable insights that will guide the finalization of an all-encompassing intervention program for the main study. The main study adopts a pioneering two-arm randomized controlled trial methodology, actively involving adolescent athletes from diverse Malaysian high schools. This approach aims to address critical lacunae in anti-doping strategies, specifically calibrated to resonate with the unique context of Malaysian schools. The study, cognizant of the imperative to develop preventive measures harmonizing with the cultural and educational milieu of Malaysian adolescent athletes, aspires to cultivate a culture of clean sport. In conclusion, this research aspires to contribute unprecedented insights into the efficacy of positive education interventions firmly rooted in the PERMA model. By unraveling the intricacies of clean sport behavior, particularly within the context of Malaysian adolescent athletes, the study seeks to introduce transformative preventive methods. The adoption of positive psychology as an anti-doping tool represents an innovative and promising approach, bridging a conspicuous gap in scholarly research and offering potential remedies for the sporting community. As this study unfolds, it carries the promise not only to enrich our understanding of clean sport behavior but also to pave the way for positive change within the realm of adolescent sports in Malaysia.
Keywords: positive education interventions, pilot study, clean sport behavior, adolescent athletes, Malaysia
Procedia PDF Downloads 58
9463 A Data-Driven Optimal Control Model for the Dynamics of Monkeypox in a Variable Population with a Comprehensive Cost-Effectiveness Analysis
Authors: Martins Onyekwelu Onuorah, Jnr Dahiru Usman
Abstract:
Introduction: In the realm of public health, the threat posed by Monkeypox continues to elicit concern, prompting rigorous studies to understand its dynamics and devise effective containment strategies. Particularly significant is its recurrence in variable populations, such as the observed outbreak in Nigeria in 2022. In light of this, our study undertakes a meticulous analysis, employing a data-driven approach to explore, validate, and propose optimized intervention strategies tailored to the distinct dynamics of Monkeypox within varying demographic structures. Utilizing a deterministic mathematical model, we delved into the intricate dynamics of Monkeypox, with a particular focus on a variable population context. Our qualitative analysis provided insights into the disease-free equilibrium, revealing its stability when R0 is less than one and discounting the possibility of backward bifurcation, as substantiated by the presence of a single stable endemic equilibrium. The model was rigorously validated using real-time data from the cases recorded in Nigeria in 2022 for epidemiological weeks 1-52. Transitioning from qualitative to quantitative, we augmented our deterministic model with optimal control, introducing three time-dependent interventions to scrutinize their efficacy and influence on the epidemic's trajectory. Numerical simulations unveiled a pronounced impact of the interventions, offering a data-supported blueprint for informed decision-making in containing the disease. A comprehensive cost-effectiveness analysis employing the Infection Averted Ratio (IAR), Average Cost-Effectiveness Ratio (ACER), and Incremental Cost-Effectiveness Ratio (ICER) facilitated a balanced evaluation of the interventions' economic and health impacts. In essence, our study epitomizes a holistic approach to understanding and mitigating Monkeypox, intertwining rigorous mathematical modeling, empirical validation, and economic evaluation. The insights derived not only bolster our comprehension of Monkeypox's intricate dynamics but also unveil optimized, cost-effective interventions. This integration of methodologies and findings underscores a pivotal stride towards aligning public health imperatives with economic sustainability, marking a significant contribution to global efforts in combating infectious diseases.
Keywords: monkeypox, equilibrium states, stability, bifurcation, optimal control, cost-effectiveness
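For reference, the cost-effectiveness measures named in the abstract are commonly defined as follows in the optimal-control literature; the specific cost terms and intervention strategies used in the study are not reproduced here.

```latex
% Common definitions of the cost-effectiveness measures named above
% (standard forms; the study's actual costs and strategies are not reproduced here)
\[
  \mathrm{IAR} = \frac{\text{number of infections averted}}{\text{number of recovered individuals}},
  \qquad
  \mathrm{ACER} = \frac{\text{total cost of the intervention}}{\text{number of infections averted}}
\]
\[
  \mathrm{ICER}_{1,2} = \frac{\text{cost of strategy 2} - \text{cost of strategy 1}}
                             {\text{infections averted by strategy 2} - \text{infections averted by strategy 1}}
\]
```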
Procedia PDF Downloads 85
9462 An Automatic Method for Building Learners’ Groups in Virtual Environment
Authors: O. Bourkoukou, Essaid El Bachari
Abstract:
Group composition is one of the key issues in collaborative learning for achieving a positive educational experience. The goal of this work is to propose to teachers and tutors a method for creating effective collaborative learning groups in an e-learning environment based on the learner profile. For this purpose, a new function was defined to implicitly rate learning objects used by the learner during his learning experience. This paper describes the proposed algorithm for building an adequate collaborative learning group. In order to verify the performance of the proposed algorithm, several experiments were conducted on a real data set in a virtual environment. Results show the effectiveness of the method and indicate that the proposed approach is promising for producing better outcomes.
Keywords: building groups, collaborative learning, e-learning, learning objects
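The abstract does not spell out the grouping algorithm, so the short Python sketch below is only a generic illustration of profile-based group formation (clustering of implicit learning-object ratings). The function name, the k-means choice, and the group-filling strategy are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_groups(profiles: np.ndarray, group_size: int = 4) -> list:
    """Generic illustration: cluster learners by their implicit learning-object ratings,
    then fill fixed-size groups cluster by cluster.
    `profiles` is an (n_learners, n_learning_objects) matrix of implicit ratings."""
    n_learners = profiles.shape[0]
    n_clusters = max(1, n_learners // group_size)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(profiles)

    # Order learners by cluster label so that similar profiles sit next to each other,
    # then slice the ordering into groups of the requested size.
    ordered = sorted(range(n_learners), key=lambda i: labels[i])
    return [ordered[i:i + group_size] for i in range(0, n_learners, group_size)]

# Example: 12 learners who implicitly rated 5 learning objects (random demo data)
ratings = np.random.rand(12, 5)
print(build_groups(ratings, group_size=4))
```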
Procedia PDF Downloads 297
9461 Real Activities Manipulation vs. Accrual Earnings Management: The Effect of Political Risk
Authors: Heba Abdelmotaal, Magdy Abdel-Kader
Abstract:
Purpose: This study explores whether a firm’s effective political risk management prevents real and accrual earnings management. Design/methodology/approach: Based on a sample of 130 firms operating in Egypt during the period 2008-2013, two hypotheses are tested using panel data regression models. Findings: The empirical findings indicate a significant relation between real and accrual earnings management and political risk. Originality/value: This paper provides statistical evidence on the effects of political risk management failure on managers’ engagement in real and accrual earnings management practices, and its impact on the firm’s performance.
Keywords: political risk, risk management failure, real activities manipulation, accrual earnings management
Procedia PDF Downloads 438
9460 STEM Curriculum Development Using Robotics with K-12 Students in Brazil
Authors: Flavio Campos
Abstract:
This paper describes the implementation of a STEM curriculum program using robotics as a technological resource at a private school in Brazil. It emphasizes the pedagogic and didactic aspects and discusses the STEM curriculum, the perspective of using robotics, and the relation between curriculum, science, and technologies in the learning process. The results indicate that integrating a STEM curriculum with robotics as a technological resource in the K-12 learning process involves complex aspects, such as the relation between time and space, the development of educators, and the relation between robotics and other subjects. Therefore, understanding these aspects could indicate steps to consider when integrating STEM foundations and robotics into the curriculum, which can significantly improve education for science and technology.
Keywords: STEM curriculum, educational robotics, constructionist approach, education and technology
Procedia PDF Downloads 342
9459 Angiotensin Converting Enzyme Gene Polymorphism Studies: A Case-Control Study
Authors: Salina Y. Saddick
Abstract:
Mild gestational hyperglycemia (MGH) is a very common complication of pregnancy that is characterized by intolerance to glucose. The association of the angiotensin-converting enzyme (ACE) insertion/deletion (I/D) polymorphism with MGH has been previously reported. In this study, we evaluated the association between the ACE polymorphism and the risk of MGH in a Saudi population. We conducted a case-control study in a population of 100 MGH patients and 100 control subjects. The ACE gene polymorphism was analyzed by the novel approach of tetra-primer amplification refractory mutation system (ARMS)-polymerase chain reaction (PCR). Neither allele nor genotype frequencies of the ACE polymorphism were associated with MGH. Glucose concentration was found to be significantly associated with the MGH group. Our study suggests that the ACE polymorphism is not associated with MGH in a Saudi population.
Keywords: MGH, ACE, insertion polymorphism, deletion polymorphism
Procedia PDF Downloads 319
9458 Evaluation of Disease Risk Variables in the Control of Bovine Tuberculosis
Authors: Berrin Şentürk
Abstract:
In this study, due to the recurrence of bovine tuberculosis in the same areas, the risk factors for the disease were determined and evaluated at the local level. The study was carried out on 32 farms where the disease was detected in the district and center of Samsun province in 2014. Predetermined risk factors, such as farm, environmental, and economic risks, were investigated using the survey method. It was found that the risks in the three groups are similar to the risk variables of the disease on the global scale. These risk factors, which increase susceptibility to the infection, must be understood by herd owners. A risk-based contagious disease management system approach should be applied to bovine tuberculosis by farmers, animal health professionals, and public and private sector decision makers.
Keywords: bovine tuberculosis, disease management, control, outbreak, risk analysis
Procedia PDF Downloads 402
9457 Designing a Socio-Technical System for Groundwater Resources Management, Applying Smart Energy and Water Meter
Authors: S. Mahdi Sadatmansouri, Maryam Khalili
Abstract:
The world nowadays encounters a serious water scarcity problem. During the past few years, with the advent of the Smart Energy and Water Meter (SEWM) and its installation at the electro-pumps of water wells, it was believed that it could be the golden key to addressing the over-pumping of groundwater resources. In fact, implementation of these Smart Meters managed to control the water table drawdown for a short time, but it was not a sustainable approach. The SEWM was considered a law enforcement facility at first; however, solving a complex socioeconomic problem like shared groundwater resources management requires more than just enforcement: participation to conserve common resources. Well owners or farmers, as water consumers, are the main and direct stakeholders of this system; other stakeholders could be government sectors, investors, technology providers, private sectors, or ordinary people. Designing a socio-technical system not only defines the role of each stakeholder but can also facilitate communication to reach the system goals while the benefits of each are considered and provided. Farmers, as the key participants in solving the groundwater problem, do not trust governments, but they would trust a fair system in which responsibilities, privileges, and benefits are clear. Technology can help this system remain impartial and productive. Social aspects provide rules, regulations, social objects, etc., for the system and help it to be more human-centered. As the design methodology, Design Thinking provides probable solutions for challenging problems and ongoing conflicts; it can enlighten the way in which the final system is designed. Using the Human Centered Design approach of IDEO helps to keep farmers at the center of the solution and provides a vision by which stakeholders’ requirements and needs are addressed effectively. Farmers would be expected to trust the system and participate in the management of their groundwater resources if they find the rules and tools of the system fair and effective. Besides, implementation of the socio-technical system could change farmers’ behavior so that they care more about their valuable shared water resources as well as their farm profit. This socio-technical system contains nine main subsystems: 1) Measurement and Monitoring system, 2) Legislation and Governmental system, 3) Information Sharing system, 4) Knowledge-based NGOs, 5) Integrated Farm Management system (using IoT), 6) Water Market and Water Banking system, 7) Gamification, 8) Agribusiness ecosystem, 9) Investment system.
Keywords: human centered design, participatory management, smart energy and water meter (SEWM), social object, socio-technical system, water table drawdown
Procedia PDF Downloads 294
9456 Sufficient Conditions for Exponential Stability of Stochastic Differential Equations with Non Trivial Solutions
Authors: Fakhreddin Abedi, Wah June Leong
Abstract:
Exponential stability of stochastic differential equations with nontrivial solutions is established in terms of Lyapunov functions. The main result of this paper shows that, under certain hypotheses on the dynamics f(.) and g(.), practical exponential stability in probability in a small neighborhood of the origin is equivalent to the existence of an appropriate Lyapunov function. Indeed, we establish exponential stability of the stochastic differential equation when almost all the state trajectories are bounded and approach a sufficiently small neighborhood of the origin. We derive sufficient conditions for exponential stability of stochastic differential equations. Finally, we give a numerical example illustrating our results.
Keywords: exponential stability in probability, stochastic differential equations, Lyapunov technique, Ito's formula
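For context, a standard statement of the Lyapunov-based criterion alluded to here, following the classical stochastic stability literature (not the paper's exact hypotheses), reads:

```latex
% Classical Lyapunov criterion for exponential stability of SDEs
% (textbook form; the paper's precise hypotheses on f and g may differ)
\[
  dX(t) = f(X(t))\,dt + g(X(t))\,dW(t), \qquad
  \mathcal{L}V(x) = V_x(x) f(x) + \tfrac{1}{2}\operatorname{tr}\!\big(g(x)^{\top} V_{xx}(x)\, g(x)\big)
\]
% If there exist positive constants c_1, c_2, c_3 and p such that
\[
  c_1 |x|^{p} \le V(x) \le c_2 |x|^{p}, \qquad
  \mathcal{L}V(x) \le -c_3\, V(x),
\]
% then the trivial solution is exponentially stable in the p-th moment,
% E|X(t)|^p <= (c_2/c_1) |X(0)|^p e^{-c_3 t}, which implies exponential stability in probability.
```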
Procedia PDF Downloads 52
9455 A Redesigned Pedagogy in Introductory Programming Reduces Failure and Withdrawal Rates by Half
Authors: Said Fares, Mary Fares
Abstract:
It is well documented that introductory computer programming courses are difficult and that failure rates are high. The aim of this project was to reduce the high failure and withdrawal rates in learning to program. This paper presents a number of changes in module organization and in the instructional delivery system for teaching CS1. Daily out-of-class help sessions and tutoring services were provided, and interactive lectures and laboratories, online resources, and timely feedback were introduced. Five years of data on 563 students in 21 sections were collected and analyzed. The primary results show that the failure and withdrawal rates were cut by more than half. Student surveys indicate a positive evaluation of the modified instructional approach, overall satisfaction with the course, and consequently, higher success and retention rates.
Keywords: failure rate, interactive learning, student engagement, CS1
Procedia PDF Downloads 308
9454 Deficits and Solutions in the Development of Modular Factory Systems
Authors: Achim Kampker, Peter Burggräf, Moritz Krunke, Hanno Voet
Abstract:
As a reaction to current challenges in factory planning, many companies consider introducing factory standards to lower planning times and decrease planning costs. If these factory standards are set up with a high level of modularity, they are defined as modular factory systems. This paper deals with the main current problems in the application of modular factory systems in practice and presents a solution approach with its basic models. The methodology is based on methods from factory planning but also uses the tools of other disciplines, such as product development and technology management, to deal with the high complexity that the development of modular factory systems implies. The four basic models that such a methodology has to contain are introduced and explained.
Keywords: factory planning, modular factory systems, factory standards, cost-benefit analysis
Procedia PDF Downloads 595
9453 The Research of Effectiveness of Animal Protection Act Implementation Reducing Animal Abuse
Authors: Yu Ling Chang
Abstract:
Since the United Nations announced the Universal Declaration of Human Rights in 1948, people have paid more and more attention to the value of lives. At the same time, life education is being vigorously promoted in different countries. Unfortunately, the results have been only moderately successful because the concept is not implemented in everyone’s daily life. Even worse, animal abuse and killing events keep happening. This research focuses on drawing a general conclusion from different countries’ Animal Protection Acts and their actual enforcement through case studies, in order to determine whether the number of animal abuse cases is directly influenced by different laws and regimes. It summarizes the central notion and spirit of the Animal Protection Acts in Germany, Japan, and Taiwan, and provides a reference for specific schemes and an analysis based on Taiwanese social culture.
Keywords: animal abuse, Animal Management Act, Animal Protection Act, social culture
Procedia PDF Downloads 214
9452 A Measuring Industrial Resiliency by Using Data Envelopment Analysis Approach
Authors: Ida Bagus Made Putra Jandhana, Teuku Yuri M. Zagloel, Rahmat Nurchayo
Abstract:
Having experienced several crises that affected industrial sector performance in the past decades, decision makers should utilize a measurement application that enables them to measure industrial resiliency more precisely. This research provides not only a framework for the development of a resilience measurement application, but also several theories for its conceptual building blocks, such as performance measurement management and resilience engineering in a real-world environment. This research is a continuation of a previously published paper on performance measurement in the industrial sector. Finally, this paper contributes an alternative performance measurement method for the industrial sector based on the resilience concept. Moreover, this research demonstrates how applicable the concept of resilience engineering is and presents its method of measurement.
Keywords: industrial, measurement, resilience, sector
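Since the title names Data Envelopment Analysis but the abstract does not restate the model, the classical CCR ratio form is shown below for orientation; the specific resilience inputs and outputs chosen in the study are not listed here.

```latex
% Classical input-oriented CCR ratio model (textbook form, for orientation only;
% the study's actual resilience inputs/outputs are not reproduced here)
\[
  \max_{u, v} \; \theta_0 = \frac{\sum_{r} u_r\, y_{r0}}{\sum_{i} v_i\, x_{i0}}
  \quad \text{s.t.} \quad
  \frac{\sum_{r} u_r\, y_{rj}}{\sum_{i} v_i\, x_{ij}} \le 1 \;\; \forall j, \qquad
  u_r, v_i \ge 0
\]
% x_{ij}: input i of decision-making unit j, y_{rj}: output r of unit j;
% unit 0 is the one being evaluated, and theta_0 = 1 indicates an efficient unit.
```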
Procedia PDF Downloads 276
9451 Narratives and Meta-Narratives in the News of People Killed in 2022 Iranian Protests
Authors: Abbas Rezaei Samarin
Abstract:
In October 2022, protests began in Iran following the death of Mahsa Amini, and they were followed by the deaths of other people arrested by Iran's morality police; Iran's official media and foreign Persian-language satellite channels presented different narratives of how they were killed to their audiences. These two types of media produced two different and sometimes conflicting narratives when faced with the news of a given person's death. This study focuses on the semiotics of these narratives, the interpretation of the discourses supporting them, and finally, their analysis within the framework of narrative theories. In the present study, the researcher has used a qualitative approach and has concluded that the narratives of both types of media are structured around the functions of the existing and the ideal political system.
Keywords: narrative, Iran, fake news, protests, manipulation of reality
Procedia PDF Downloads 98
9450 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans
Authors: Rene Hellmuth
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity) lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Use as a planning basis for restructuring measures in factories only succeeds if the BIM model has adequate data quality. Under this aspect and the industrial requirements, three data quality factors are particularly important for this paper regarding the BIM model: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied in a short period of time on the construction site during conversion measures? An as-is analysis is made of how BIM models and digital factory models (including laser scans) are currently being kept up to date. Industrial companies are interviewed, and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for how cost-effective and time-saving updating processes can be carried out. The availability of low-cost hardware and the simplicity of the process are important to enable service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and its insertion into the overall digital twin. Finally, an overview of the possibilities for visualizations suitable for construction sites is compiled. An augmented reality application is created based on an updated BIM model of a factory and installed on a tablet. Conversion scenarios with costs and time expenditure are displayed. A user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: an as-is analysis of current update processes for BIM models and laser scans, the development of a time-saving and cost-effective update process, and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.
Keywords: building information modeling, digital factory model, factory planning, restructuring
Procedia PDF Downloads 114
9449 Tumor Detection Using Convolutional Neural Networks (CNN) Based Neural Network
Authors: Vinai K. Singh
Abstract:
In neural-network-based learning techniques, there are several models of convolutional networks. Only when the methods are deployed with large datasets can their applicability and appropriateness be determined. Clinical and pathological pictures of lobular carcinoma are thought to exhibit a large number of random formations and textures. Working with such pictures is a difficult problem in machine learning. Numerous studies focusing on wet laboratories and their outcomes have been published with fresh commentary on the investigation. In this research, we provide a framework that can operate effectively on raw photos of various resolutions while easing the issues caused by the existence of patterns and texturing. The suggested approach produces very good findings that may be used to make decisions in the diagnosis of cancer.
Keywords: lobular carcinoma, convolutional neural networks (CNN), deep learning, histopathological imagery scans
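The abstract does not describe the network architecture, so the minimal PyTorch sketch below only illustrates the general class of convolutional classifier implied by the keywords; the layer sizes and class name are assumptions, not the paper's model.

```python
import torch
import torch.nn as nn

class SimpleTumorCNN(nn.Module):
    """Minimal illustrative CNN for binary tumor / no-tumor classification of
    histopathology patches; layer sizes are assumptions, not the paper's model."""
    def __init__(self, in_channels: int = 3, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # makes the classifier head resolution-independent
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Smoke test on a random 224x224 RGB patch
model = SimpleTumorCNN()
print(model(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 2])
```

The global average pooling layer is one simple way to handle "raw photos of various resolutions" mentioned in the abstract, since it removes the dependence of the classifier head on input size.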
Procedia PDF Downloads 136
9448 Threading Professionalism Through Occupational Therapy Curriculum: A Framework and Resources
Authors: Ashley Hobson, Ashley Efaw
Abstract:
Professionalism is an essential skill for clinicians, particularly for Occupational Therapy Providers (OTPs). The World Federation of Occupational Therapists (WFOT) Guiding Principles for Ethical Occupational Therapy and the American Occupational Therapy Association (AOTA) Code of Ethics establish expectations for professionalism among OTPs, emphasizing its importance in the field. However, the teaching and assessment of professionalism vary across OTP programs. The flexibility provided by the country standards allows programs to determine their own approaches to meeting these standards, resulting in inconsistency. Educators in both academic and fieldwork settings face challenges in objectively assessing and providing feedback on student professionalism. Although they observe instances of unprofessional behavior, there is no standardized assessment measure to evaluate professionalism in OTP students. While most students are committed to learning and applying professionalism skills, they enter OTP programs with varying levels of proficiency in this area. Consequently, they lack a uniform understanding of professionalism and an objective means to self-assess their current skills and identify areas for growth. It is crucial to explicitly teach professionalism, have students self-assess their professionalism skills, and have OTP educators assess student professionalism. This approach is necessary for fostering students' professionalism journeys. Traditionally, there has been no objective way for students to self-assess their professionalism or for educators to provide objective assessments and feedback. To establish a uniform approach to professionalism, the authors incorporated professionalism content into their curriculum. Utilizing an operational definition of professionalism, the authors integrated professionalism into didactic, fieldwork, and capstone courses. The complexity of the content and the professionalism skills expected of students increase each year to ensure students graduate with the skills to practice in accordance with the WFOT Guiding Principles for Ethical Occupational Therapy Practice and the AOTA Code of Ethics. Two professionalism assessments were developed based on the expectations outlined in both documents. The Professionalism Self-Assessment allows students to evaluate their professionalism, reflect on their performance, and set goals. The Professionalism Assessment for Educators is a modified version of the same tool designed for educators. The purpose of this workshop is to provide educators with a framework and tools for assessing student professionalism. The authors discuss how to integrate professionalism content into the OTP curriculum and how to utilize professionalism assessments to provide constructive feedback and equitable learning opportunities for OTP students in academic, fieldwork, and capstone settings. By adopting these strategies, educators can enhance the development of professionalism among OTP students, ensuring they are well-prepared to meet the demands of the profession.
Keywords: professionalism, assessments, student learning, student preparedness, ethical practice
Procedia PDF Downloads 41
9447 Artificial Intelligence Based Comparative Analysis for Supplier Selection in Multi-Echelon Automotive Supply Chains via GEP and ANN Models
Authors: Seyed Esmail Seyedi Bariran, Laysheng Ewe, Amy Ling
Abstract:
Since supplier selection is a vital decision, selecting suppliers in the best and most accurate way is of great importance for enterprises. In this study, a new artificial intelligence approach is employed to remove weaknesses of supplier selection. The paper has three parts. The first part is choosing the appropriate criteria for assessing the suppliers' performance. The next is collecting the data set based on expert judgment. Afterwards, the data set is divided into two parts, the training data set and the testing data set. Using the training data set, the best structures of the GEP and ANN models are selected, and the testing data set is used to evaluate the power of the mentioned methods. The results obtained show that the accuracy of GEP is higher than that of ANN. Moreover, unlike ANN, GEP provides an explicit mathematical equation for supplier selection.
Keywords: supplier selection, automotive supply chains, ANN, GEP
Procedia PDF Downloads 631
9446 A Genetic-Neural-Network Modeling Approach for Self-Heating in GaN High Electron Mobility Transistors
Authors: Anwar Jarndal
Abstract:
In this paper, a genetic-neural-network (GNN) based large-signal model for GaN HEMTs is presented along with its parameter extraction procedure. The model is easy to construct and implement in CAD software and requires only DC and S-parameter measurements. An improved decomposition technique is used to model the self-heating effect. Two GNN models are constructed to simulate isothermal drain current and power dissipation, respectively. The two models are then composed to simulate the drain current. The modeling procedure was applied to a packaged GaN-on-Si HEMT, and the developed model is validated by comparing its large-signal simulation with measured data. A very good agreement between the simulation and measurement is obtained.
Keywords: GaN HEMT, computer-aided design and modeling, neural networks, genetic optimization
Procedia PDF Downloads 382
9445 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison
Authors: Xiangtuo Chen, Paul-Henry Cournéde
Abstract:
Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their different modeling methodologies. The model-driven approaches are based on crop mechanistic modeling. They describe crop growth in interaction with the environment as dynamical systems. However, the calibration of these dynamical systems is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data, final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach for yield prediction is free of the complex biophysical process, but it has strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression or Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method to calibrate the mechanistic model from easily accessible datasets offers several side perspectives. The mechanistic model can potentially help to underline the stresses suffered by the crop or to identify biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest
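As a small, hedged illustration of the data-driven baseline described above (not the authors' code or data), scikit-learn's Random Forest can be evaluated with 5-fold cross-validation and the two reported error metrics as follows; the synthetic data below only stands in for the USDA/climate records.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict, KFold
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Placeholder data standing in for the 720 county-level records (climate features -> corn yield);
# the real USDA/climatic dataset is not reproduced here.
rng = np.random.default_rng(0)
X = rng.normal(size=(720, 12))                  # e.g. monthly temperature / precipitation features
y = 10.0 + 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=720)

model = RandomForestRegressor(n_estimators=500, random_state=0)
cv = KFold(n_splits=5, shuffle=True, random_state=0)
y_pred = cross_val_predict(model, X, y, cv=cv)  # out-of-fold predictions

rmsep = np.sqrt(mean_squared_error(y, y_pred))  # root mean square error of prediction
maep = mean_absolute_error(y, y_pred)           # mean absolute error of prediction
print(f"RMSEP = {rmsep:.3f}, MAEP = {maep:.3f}")
```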
Procedia PDF Downloads 231
9444 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics
Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee
Abstract:
Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably the so-called "hallucinations", the generation of outputs that are not grounded in the input data, which hinders their adoption into production. A common practice to mitigate the hallucination problem is to utilize a Retrieval Augmented Generation (RAG) system to ground LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, and then generates a response that is based not only on its pre-trained knowledge but also on the specific information from the retrieved documents. However, the RAG system is not suitable for tabular data and subsequent data analysis tasks due to multiple reasons, such as information loss, data format, and retrieval mechanism. In this study, we have explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When a beta version was deployed on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results, as it was able to meet market insight and data visualization needs with high accuracy and extensive coverage by abstracting the complexities for real-estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding in market understanding and enhancement without the need for programming skills. The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru
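The abstract describes the agent pipeline only at a high level; the Python pseudocode below sketches one plausible planning-and-execution loop with a code-generation step. The llm() helper, the prompts, and the use of exec() as a stand-in sandbox are placeholders invented for illustration, not PropertyGuru's implementation.

```python
def llm(prompt: str) -> str:
    """Placeholder for a call to any LLM API; not a real client."""
    raise NotImplementedError

def run_analysis(question: str, table_schema: str) -> str:
    # 1. Planning agent: break the analytical question into ordered sub-tasks.
    plan = llm(f"Break this data-analysis question into numbered sub-tasks.\n"
               f"Question: {question}\nAvailable tables: {table_schema}")

    results = []
    for sub_task in plan.splitlines():
        if not sub_task.strip():
            continue
        # 2. Code-generation agent: turn each sub-task into executable pandas code.
        code = llm(f"Write Python (pandas) code for this sub-task only:\n{sub_task}\n"
                   f"Tables: {table_schema}")
        # 3. Execution: run the generated code and capture its output.
        #    exec() is used here purely for illustration; a real system would isolate it.
        scope = {}
        exec(code, scope)
        results.append(str(scope.get("result", "")))

    # 4. Synthesis: compose the final, grounded answer from the executed outputs.
    return llm(f"Question: {question}\nExecuted results: {results}\n"
               f"Write the final answer using only these results.")
```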
Procedia PDF Downloads 87
9443 The Inverse Problem in Energy Beam Processes Using Discrete Adjoint Optimization
Authors: Aitor Bilbao, Dragos Axinte, John Billingham
Abstract:
The inverse problem in Energy Beam (EB) processes consists of defining the control parameters, in particular the 2D beam path (position and orientation of the beam as a function of time), to arrive at a prescribed solution (freeform surface). This inverse problem is well understood for conventional machining, because the cutting tool geometry is well defined and material removal is a time-independent process. In contrast, EB machining is achieved through the local interaction of a beam of particular characteristics (e.g. energy distribution), which leads to a surface-dependent removal rate. Furthermore, EB machining is a time-dependent process in which not only the beam varies with the dwell time, but any acceleration/deceleration of the machine/beam delivery system when performing raster paths will influence the actual geometry of the surface to be generated. Two different EB processes, Abrasive Waterjet Machining (AWJM) and Pulsed Laser Ablation (PLA), are studied. Even though they are considered independent technologies, both can be described as time-dependent processes. AWJM can be considered a continuous process, and the etched material depends on the feed speed of the jet at each instant during the process. On the other hand, PLA processes are usually defined as discrete systems, and the total removed material is calculated by the summation of the different pulses shot during the process. The overlapping of these shots depends on the feed speed and the frequency between two consecutive shots. However, if the feed speed is sufficiently slow compared with the frequency, then consecutive shots are close enough and the behaviour can be similar to a continuous process. Using this approximation, a generic continuous model can be described for both processes. The inverse problem is usually solved for this kind of process by simply controlling dwell time in proportion to the required depth of milling at each single pixel on the surface, using a linear model of the process. However, this approach does not always lead to a good solution, since linear models are only valid when shallow surfaces are etched. The solution of the inverse problem is improved by using a discrete adjoint optimization algorithm. Moreover, the calculation of the Jacobian matrix consumes less computation time than finite difference approaches. The influence of the dynamics of the machine on the actual movement of the jet is also important and should be taken into account. When the parameters of the controller are not known or cannot be changed, a simple approximation is used for the choice of the slope of a step profile. Several experimental tests are performed for both technologies to show the usefulness of this approach.
Keywords: abrasive waterjet machining, energy beam processes, inverse problem, pulsed laser ablation
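To make the linear dwell-time approximation and its limitation concrete, a simplified, author-independent statement of the forward model is given below; the notation is illustrative and not the paper's exact formulation.

```latex
% Simplified forward model for energy-beam footprint machining
% (illustrative notation, not the paper's exact formulation)
\[
  Z(\mathbf{x}) = \int_{0}^{T} E\big(\mathbf{x} - \mathbf{p}(t),\, Z\big)\,dt
\]
% Z: etched depth, p(t): beam path, E: surface-dependent removal-rate footprint.
% The common linear inverse solution neglects the dependence of E on Z, giving
\[
  t_{\mathrm{dwell}}(\mathbf{x}) \approx \frac{Z_{\mathrm{target}}(\mathbf{x})}{E_{0}(\mathbf{x})},
\]
% which holds only for shallow features; the adjoint approach instead minimises
% J = || Z(t_dwell) - Z_target ||^2 with gradients obtained from the discrete adjoint.
```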
Procedia PDF Downloads 275