Search results for: k-means clustering approach
12233 Recursive Doubly Complementary Filter Design Using Particle Swarm Optimization
Authors: Ju-Hong Lee, Ding-Chen Chung
Abstract:
This paper deals with the optimal design of recursive doubly complementary (DC) digital filters using a metaheuristic-based optimization technique. Based on the theory of DC digital filters built from two recursive digital all-pass filters (DAFs), the design problem is formulated as an objective function given by a weighted sum of the phase response errors of the designed DAFs. To deal with the stability of the recursive DC filters during the design process, we impose the necessary constraints on the phases of the recursive DAFs. Through frequency sampling and a weighted least squares approach, the optimization of the objective function is carried out with a population-based stochastic optimization approach. The resulting DC digital filters possess satisfactory frequency responses. Simulation results are presented for illustration and comparison. Keywords: doubly complementary, digital all-pass filter, weighted least squares algorithm, particle swarm optimization
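For readers unfamiliar with the optimizer, the sketch below illustrates the kind of population-based search the abstract refers to: a plain global-best particle swarm minimizing a weighted least-squares phase-error objective. The two-parameter phase model, the target phase, and all gains are illustrative placeholders, not the paper's DAF formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frequency grid and a hypothetical desired phase response (illustrative only).
w = np.linspace(0, np.pi, 64)
phase_desired = -2.0 * w                       # stand-in target phase
weight = np.ones_like(w)                       # weighting of the phase errors

def phase_error(coeffs):
    """Weighted least-squares phase error for a toy 2-parameter phase model."""
    a, b = coeffs
    phase_model = -(a * w + b * np.sin(w))     # placeholder for a DAF phase
    return np.sum(weight * (phase_model - phase_desired) ** 2)

# Plain global-best PSO.
n_particles, n_dims, n_iters = 20, 2, 200
x = rng.uniform(-3, 3, (n_particles, n_dims))  # positions
v = np.zeros_like(x)                           # velocities
pbest = x.copy()
pbest_f = np.array([phase_error(p) for p in x])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, n_dims))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = np.array([phase_error(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("best coefficients:", gbest, "objective:", phase_error(gbest))
```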
Procedia PDF Downloads 688
12232 Numerical Solutions of an Option Pricing Rainfall Derivatives Model
Authors: Clarinda Vitorino Nhangumbe, Ercília Sousa
Abstract:
Weather derivatives are financial products used to cover non-catastrophic weather events, with a weather index as the underlying asset. The rainfall derivative pricing model is built on the assumption that the rainfall dynamics follow an Ornstein-Uhlenbeck process, and the partial differential equation approach is used to derive a two-dimensional, time-dependent convection-diffusion partial differential equation whose spatial variables are the rainfall index and the rainfall depth. To compute approximate solutions of the partial differential equation, appropriate boundary conditions are suggested, and an explicit numerical method is proposed in order to deal efficiently with the different choices of the coefficients involved in the equation. Being explicit, the method is only conditionally stable, so the stability region of the numerical method and its order of convergence are discussed. The model is tested on real precipitation data. Keywords: finite differences method, Ornstein-Uhlenbeck process, partial differential equations approach, rainfall derivatives
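As an illustration of the numerical ingredient described above, the following sketch advances a one-dimensional convection-diffusion equation with an explicit finite-difference scheme under a conditional stability restriction on the time step. The coefficients, initial condition, and boundary values are placeholders, not the calibrated rainfall-model parameters.

```python
import numpy as np

# Illustrative 1D convection-diffusion u_t + c u_x = D u_xx solved with an
# explicit (upwind convection / centred diffusion) scheme.
c, D = 1.0, 0.1
nx, L = 101, 1.0
dx = L / (nx - 1)
dt = min(0.4 * dx**2 / D, 0.4 * dx / c)   # conditional stability constraints
x = np.linspace(0.0, L, nx)
u = np.exp(-100 * (x - 0.5) ** 2)         # initial condition

for _ in range(500):
    un = u.copy()
    # explicit update on the interior nodes
    u[1:-1] = (un[1:-1]
               - c * dt / dx * (un[1:-1] - un[:-2])
               + D * dt / dx**2 * (un[2:] - 2 * un[1:-1] + un[:-2]))
    u[0], u[-1] = 0.0, 0.0                # illustrative Dirichlet boundaries

print("max value after time-stepping:", u.max())
```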
Procedia PDF Downloads 105
12231 A Framework for Successful TQM Implementation and Its Effect on the Organizational Sustainability Development
Authors: Redha Elhuni, M. Munir Ahmad
Abstract:
The main purpose of this research is to construct a generic model for the successful implementation of Total Quality Management (TQM) in the oil sector, and to find out the effects of this model on the organizational sustainability development (OSD) performance of Libyan oil and gas companies using the structural equation modeling (SEM) approach. The research approach covers both quantitative and qualitative methods. A questionnaire was developed in order to identify the quality factors that Libyan oil and gas companies see as critical to the success of TQM implementation. Hypotheses were developed to evaluate the impact of TQM implementation on OSD. Data analysis reveals a significant positive effect of TQM implementation on OSD. Twenty-four quality factors are found to be critical and absolutely essential for successful TQM implementation. The results generated a structure for the TQMSD implementation framework based on four major road map constructs (top management commitment, employee involvement and participation, customer-driven processes, and continuous improvement culture). Keywords: total quality management, critical success factors, oil and gas, organizational sustainability development (SD), Libya
Procedia PDF Downloads 273
12230 Immiscible Polymer Blends with Controlled Nanoparticle Location for Excellent Microwave Absorption: A Compartmentalized Approach
Authors: Sourav Biswas, Goutam Prasanna Kar, Suryasarathi Bose
Abstract:
In order to obtain better materials, control over the precise location of nanoparticles is indispensable. It is shown here that an ordered arrangement of nanoparticles possessing different characteristics (electrical/magnetic dipoles) in the blend structure can result in excellent microwave absorption. This is manifested in a high reflection loss of ca. -67 dB for the best blend structure designed here. To attenuate electromagnetic radiation, the key parameters, i.e., high electrical conductivity and large dielectric/magnetic loss, are targeted here using a conducting inclusion [multiwall carbon nanotubes, MWNTs]; a ferroelectric nanostructured material with associated relaxations in the GHz range [barium titanate, BT]; and lossy ferromagnetic nanoparticles [nickel ferrite, NF]. In this study, bi-continuous structures were designed using 50/50 (by wt) blends of polycarbonate (PC) and polyvinylidene fluoride (PVDF). The MWNTs were modified using an electron acceptor molecule, a derivative of perylenediimide, which facilitates π-π stacking with the nanotubes and stimulates efficient charge transport in the blends. The nanoscopic materials have a specific affinity towards the PVDF phase. Hence, by introducing surface-active groups, an ordered arrangement can be tailored. To accomplish this, both BT and NF were first hydroxylated, followed by the introduction of amine-terminal groups on the surface. The latter facilitated a nucleophilic substitution reaction with PC and resulted in their precise location. In this study, we have shown for the first time that superior EM attenuation can be achieved by a compartmentalized approach. For instance, when the nanoparticles were localized exclusively in the PVDF phase or in both phases, the minimum reflection loss was ca. -18 dB (for the MWNT/BT mixture) and -29 dB (for the MWNT/NF mixture), and the shielding was primarily through reflection. Interestingly, by adopting the compartmentalized approach, wherein the lossy materials were in the PC phase and the conducting inclusion (MWNT) in PVDF, an outstanding reflection loss of ca. -57 dB (for the BT and MWNT combination) and -67 dB (for the NF and MWNT combination) was noted, and the shielding was primarily through absorption. Thus, the approach demonstrates that nanoscopic structuring in the blends can be achieved under macroscopic processing conditions, and this strategy can be further explored to design microwave absorbers. Keywords: barium titanate, EMI shielding, MWNTs, nickel ferrite
Procedia PDF Downloads 447
12229 An Abductive Approach to Policy Analysis: Policy Analysis as Informed Guessing
Authors: Adrian W. Chew
Abstract:
This paper argues that education policy analysis tends to be steered towards empiricist oriented approaches, which place emphasis on objective and measurable data. However, this paper argues that empiricist oriented approaches are generally based on inductive and/or deductive reasoning, which are unable to generate new ideas/knowledge. This paper will outline the logical structure of induction, deduction, and abduction, and argues that only abduction provides possibilities for the creation of new ideas/knowledge. This paper proposes the neologism of ‘informed guessing’ as a reformulation of abduction, and also as an approach to education policy analysis. On one side, the signifier ‘informed’ encapsulates the idea that abductive policy analysis needs to be informed by descriptive conceptualization theory to be able to make relations and connections between, and within, observed phenomenon and unobservable general structures. On the other side, the signifier ‘guessing’ captures the cyclical and unsystematic process of abduction. This paper will end with a brief example of utilising ‘informed guessing’ for a policy analysis of school choice lotteries in the United States.Keywords: abductive reasoning, empiricism, informed guessing, policy analysis
Procedia PDF Downloads 352
12228 Analyses and Optimization of Physical and Mechanical Properties of Direct Recycled Aluminium Alloy (AA6061) Wastes by ANOVA Approach
Authors: Mohammed H. Rady, Mohd Sukri Mustapa, S Shamsudin, M. A. Lajis, A. Wagiman
Abstract:
The present study is aimed at investigating the microhardness and density of aluminium alloy chips subjected to various settings of preheating temperature and preheating time. Three values of preheating temperature were taken: 450 °C, 500 °C, and 550 °C. Likewise, three values of preheating time were chosen: 1, 2, and 3 hours. The influences of the process parameters (preheating temperature and time) were analyzed using a Design of Experiments (DOE) approach in which a full factorial design with center point analysis was adopted. There were 11 runs in total, comprising a two-factor full factorial design with 3 center points. The responses were microhardness and density. The results showed that density and microhardness increased with decreasing preheating temperature. The results also showed that controlling the preheating temperature matters more than the preheating time for microhardness, while both the preheating temperature and the preheating time are important for density. It can be concluded that setting the temperature at 450 °C for 1 hour yields the optimum responses. Keywords: AA6061, density, DOE, hot extrusion, microhardness
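A minimal sketch of the kind of factorial analysis described above is given below: a coded two-factor full factorial design with centre points, fitted by ordinary least squares to estimate the main effects and the interaction. The response values are hypothetical stand-ins for measured microhardness, not the study's data.

```python
import itertools
import numpy as np

# Two-factor full factorial (coded -1/+1) plus three centre points.
corners = np.array(list(itertools.product([-1.0, 1.0], repeat=2)))
centres = np.zeros((3, 2))
X_coded = np.vstack([corners, centres])           # preheating temperature, time

# Hypothetical responses (e.g. microhardness) for each run.
y = np.array([62.0, 55.0, 58.0, 50.0, 56.5, 57.0, 56.0])

# Fit y = b0 + b1*T + b2*t + b12*T*t by ordinary least squares.
design = np.column_stack([np.ones(len(y)),
                          X_coded[:, 0],
                          X_coded[:, 1],
                          X_coded[:, 0] * X_coded[:, 1]])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
for name, b in zip(["intercept", "temperature", "time", "temperature x time"], coef):
    print(f"{name:>20s}: {b:+.2f}")
```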
Procedia PDF Downloads 349
12227 Analysis of Fault Tolerance on Grid Computing in Real Time Approach
Authors: Parampal Kaur, Deepak Aggarwal
Abstract:
In the computational grid, fault tolerance is an imperative issue to be considered during job scheduling. Due to the widespread use of resources, systems are highly prone to errors and failures. Hence, fault tolerance plays a key role in the grid in avoiding the problem of unreliability. Scheduling a task to the appropriate resource is a vital requirement in the computational grid. The fittest-resource scheduling algorithm searches for the appropriate resource based on the job requirements, in contrast to general scheduling algorithms, where jobs are scheduled to the resources with the best performance factor. The proposed method improves the fault tolerance of the fittest-resource scheduling algorithm by scheduling the job in coordination with job replication when the resource has low reliability. Based on its reliability index, a resource is identified as critical, and tasks are scheduled according to the criticality of the resources. Results show that the execution time of the tasks is comparatively reduced with the proposed algorithm using a real-time approach rather than a simulator. Keywords: computational grid, fault tolerance, task replication, job scheduling
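The sketch below illustrates the scheduling idea in the abstract under simplified assumptions: jobs go to the fittest (tightest-fitting) resource, and a job is replicated on a backup resource when the chosen resource's reliability index falls below a threshold. The resource names, capacities, reliabilities, and threshold are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    capacity: float      # e.g. available CPU
    reliability: float   # reliability index in [0, 1]

@dataclass
class Job:
    name: str
    demand: float        # required capacity

# Hypothetical grid resources and jobs (names and numbers are illustrative).
resources = [Resource("R1", 8.0, 0.95), Resource("R2", 4.0, 0.60), Resource("R3", 6.0, 0.80)]
jobs = [Job("J1", 3.0), Job("J2", 5.0)]
RELIABILITY_THRESHOLD = 0.75   # below this, the resource is treated as critical

def fittest(job, pool):
    """Pick the resource whose spare capacity most tightly fits the job."""
    feasible = [r for r in pool if r.capacity >= job.demand]
    return min(feasible, key=lambda r: r.capacity - job.demand) if feasible else None

for job in jobs:
    primary = fittest(job, resources)
    if primary is None:
        print(f"{job.name}: no feasible resource")
        continue
    plan = [primary.name]
    if primary.reliability < RELIABILITY_THRESHOLD:
        # Critical resource: replicate the task on the next-fittest reliable resource.
        backup = fittest(job, [r for r in resources
                               if r is not primary and r.reliability >= RELIABILITY_THRESHOLD])
        if backup:
            plan.append(backup.name)
    print(f"{job.name} -> {plan}")
```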
Procedia PDF Downloads 436
12226 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach
Authors: D. Tedesco, G. Feletti, P. Trucco
Abstract:
The present study aims to develop a Decision Support System (DSS) to support the operational decision of the Emergency Medical Service (EMS) regarding the assignment of medical emergency requests to Emergency Departments (ED). In the literature, this problem is also known as “hospital selection” and concerns the definition of policies for the selection of the ED to which patients who require further treatment are transported by ambulance. The employed research methodology consists of the first phase of revision of the technical-scientific literature concerning DSSs to support the EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies are mainly focused on the EMS phases related to the ambulance service and consider a process that ends when the ambulance is available after completing a request. Therefore, all the ED-related issues are excluded and considered as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, thus allowing to minimize the transport time and release the ambulance in the shortest possible time. The purpose of the present study consists in developing an optimization model for assigning medical emergency requests to the EDs, considering information relating to the subsequent phases of the process, such as the case-mix, the expected service throughput times, and the operational capacity of different EDs in hospitals. To this end, a Discrete Event Simulation (DES) model was created to evaluate different hospital selection policies. Therefore, the next steps of the research consisted of the development of a general simulation architecture, its implementation in the AnyLogic software and its validation on a realistic dataset. The hospital selection policy that produced the best results was the minimization of the Time To Provider (TTP), considered as the time from the beginning of the ambulance journey to the ED at the beginning of the clinical evaluation by the doctor. Finally, two approaches were further compared: a static approach, which is based on a retrospective estimate of the TTP, and a dynamic approach, which is based on a predictive estimate of the TTP determined with a constantly updated Winters model. Findings reveal that considering the minimization of TTP as a hospital selection policy raises several benefits. It allows to significantly reduce service throughput times in the ED with a minimum increase in travel time. Furthermore, an immediate view of the saturation state of the ED is produced and the case-mix present in the ED structures (i.e., the different triage codes) is considered, as different severity codes correspond to different service throughput times. Besides, the use of a predictive approach is certainly more reliable in terms of TTP estimation than a retrospective approach but entails a more difficult application. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMSs performance.Keywords: discrete event simulation, emergency medical services, forecast model, hospital selection
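The following toy comparison illustrates the two selection policies discussed above: proximity (minimum travel time) versus minimum predicted Time To Provider (travel plus forecast waiting time at the ED). The hospital names, travel times, and waiting-time forecasts are invented for illustration; in the study the waiting component comes from the simulation and forecasting models.

```python
# Minimal sketch of a Time-To-Provider (TTP) based hospital-selection rule.
hospitals = {
    "ED_A": {"travel_min": 8.0,  "predicted_wait_min": 45.0},
    "ED_B": {"travel_min": 15.0, "predicted_wait_min": 20.0},
    "ED_C": {"travel_min": 11.0, "predicted_wait_min": 30.0},
}

def predicted_ttp(ed):
    """TTP = ambulance travel time + predicted time until clinical evaluation."""
    return ed["travel_min"] + ed["predicted_wait_min"]

# Proximity policy: minimise travel time only.
proximity_choice = min(hospitals, key=lambda k: hospitals[k]["travel_min"])
# TTP policy: minimise travel time plus forecast waiting time at the ED.
ttp_choice = min(hospitals, key=lambda k: predicted_ttp(hospitals[k]))

print("proximity policy selects:", proximity_choice)   # ED_A
print("min-TTP policy selects:  ", ttp_choice)         # ED_B
```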
Procedia PDF Downloads 90
12225 Software User Experience Enhancement through User-Centered Design and Co-design Approach
Authors: Shan Wang, Fahad Alhathal, Hari Subramanian
Abstract:
User-centered design skills play an important role in crafting a positive and intuitive user experience for software applications. Embracing a user-centric design approach involves understanding the needs, preferences, and behaviors of the end-users throughout the design process. This mindset not only enhances the usability of the software but also fosters a deeper connection between the digital product and its users. This paper encompasses a 6-month knowledge exchange collaboration project between an academic institution and an external industry in 2023 in the UK; it aims to improve the user experience of a digital platform utilized for a knowledge management tool, to understand users' preferences for features, identify sources of frustration, and pinpoint areas for enhancement. This research conducted one of the most effective methods to implement user-centered design through co-design workshops for testing user onboarding experiences that involve the active participation of users in the design process. More specifically, in January 2023, we organized eight co-design workshops with a diverse group of 11 individuals. Throughout these co-design workshops, we accumulated a total of 11 hours of qualitative data in both video and audio formats. Subsequently, we conducted an analysis of user journeys, identifying common issues and potential areas for improvement within three insights. This analysis was pivotal in guiding the knowledge management software in prioritizing feature enhancements and design improvements. Employing a user-centered design thinking process, we developed a series of graphic design solutions in collaboration with the software management tool company. These solutions were targeted at refining onboarding user experiences, workplace interfaces, and interactive design. Some of these design solutions were translated into tangible interfaces for the knowledge management tool. By actively involving users in the design process and valuing their input, developers can create products that are not only functional but also resonate with the end-users, ultimately leading to greater success in the competitive software landscape. In conclusion, this paper not only contributes insights into designing onboarding user experiences for software within a co-design approach but also presents key theories on leveraging the user-centered design process in software design to enhance overall user experiences.Keywords: user experiences design, user centered design, co-design approach, knowledge management tool
Procedia PDF Downloads 8
12224 Money Laundering Risk Assessment in the Banking Institutions: An Experimental Approach
Authors: Yusarina Mat-Isa, Zuraidah Mohd-Sanusi, Mohd-Nizal Haniff, Paul A. Barnes
Abstract:
In view of the fact that money laundering has become a prominent concern for banking institutions, it is an obligation for them to adopt a risk-based approach as an integral component of the accepted policies on anti-money laundering. In doing so, those involved with banking operations are the most critical group of personnel, as these are the people who deal with the day-to-day operations of the banking institutions and are obligated to form a judgement on the level of impending risk. This requirement extends to all relevant banking institution staff, such as tellers and customer account representatives, who must identify suspicious customers and escalate them to the relevant authorities. Banking institution staff, however, face enormous challenges in identifying and distinguishing money launderers from legitimate customers seeking genuine banking transactions. Banking institution staff are mostly educated and trained with the business objective of serving customers in mind and are not trained to be “detectives with a detective’s power of observation”. Despite increasing awareness and the training conducted for banking institution staff, their competency in assessing money laundering risk is still insufficient. Several gaps have prompted this study, including the lack of behavioural perspectives in the assessment of money laundering risk in banking institutions. Utilizing an experimental approach, respondents are randomly assigned within a controlled setting to manipulated situations, upon which the judgement of the respondents is solicited based on various observations related to the situations. The study suggests that it is imperative that informed judgement be exercised in arriving at the decision to proceed with the banking services required by customers. Judgement forms the basis of opinion for banking institution staff to decide whether customers pose a money laundering risk. Failure to exercise good judgement could result in losses and the absorption of unnecessary risk into the banking institutions. Although banking institutions have access to automated solutions for assessing money laundering risk, the human factor in assessing the risk is indispensable. Individual staff in the banking institutions are the first line of defence, responsible for screening the impending risk of any customer soliciting banking services. At the end of the spectrum, individual involvement in money laundering risk assessment is not a substitute for automated solutions, as human judgement is inimitable. Keywords: banking institutions, experimental approach, money laundering, risk assessment
Procedia PDF Downloads 267
12223 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model
Authors: Didier Auroux, Vladimir Groza
Abstract:
This work is part of the STEEP Marie-Curie ITN project and focuses on the identification of unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The necessity of studying this problem comes from industrial milling applications, where the ability to predict and model the final surface with high accuracy is one of the primary tasks in the absence of any knowledge of the model parameters that should be used. In this framework, we propose the identification of the model parameters by minimizing a cost function measuring the difference between experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, gives the opportunity to find the unknowns of the AWJM model and the optimal values that can be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strictly depends on the input measurements. Regularization terms can be used effectively to deal with the presence of data noise and to improve the correctness of the identification. Based on this approach, we present 2D and 3D results for the identification of the model parameters and for the surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain acceptable results for manufacturing and to expect the proper identification of the unknowns. This approach also gives us the ability to extend the research to more complex cases, such as a 3D time-dependent model with variations of the jet feed speed. Keywords: Abrasive Waterjet Milling, inverse problem, model parameters identification, regularization
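A minimal sketch of regularized parameter identification in the same spirit is shown below: a toy Gaussian "trench" forward model, synthetic noisy data, and a least-squares fit with a Tikhonov penalty that stabilizes the ill-posed problem. The forward model, parameters, and regularization weight are illustrative and unrelated to the actual AWJM model or the TAPENADE-based adjoint.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Toy forward model standing in for a trench profile: a Gaussian-shaped trench
# parameterised by depth and width (purely illustrative parameters).
x = np.linspace(-2.0, 2.0, 200)

def forward(params):
    depth, width = params
    return -depth * np.exp(-(x / width) ** 2)

true_params = np.array([0.8, 0.5])
data = forward(true_params) + 0.02 * rng.standard_normal(x.size)  # noisy "measurement"

alpha = 1e-2                       # Tikhonov regularisation weight
prior = np.array([0.5, 0.5])       # prior guess of the parameters

def residuals(params):
    # Data misfit stacked with a regularisation term that penalises departure
    # from the prior; this is what stabilises the ill-posed identification.
    return np.concatenate([forward(params) - data,
                           np.sqrt(alpha) * (params - prior)])

fit = least_squares(residuals, x0=prior)
print("identified parameters:", fit.x, "(true:", true_params, ")")
```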
Procedia PDF Downloads 316
12222 Disaggregating and Forecasting the Total Energy Consumption of a Building: A Case Study of a High Cooling Demand Facility
Authors: Juliana Barcelos Cordeiro, Khashayar Mahani, Farbod Farzan, Mohsen A. Jafari
Abstract:
Energy disaggregation has been a focus of many energy companies, since energy efficiency can be achieved when the breakdown of energy consumption is known. Companies have been investing in technologies to come up with software and/or hardware solutions that can provide this type of information to the consumer. On the other hand, not everyone can afford these technologies. Therefore, in this paper, we present a methodology for breaking down the aggregate consumption and identifying the high-demanding end-use profiles. These energy profiles are used to build the forecast model for optimal control purposes. A facility with a high cooling load is used as an illustrative case study to demonstrate the results of the proposed methodology. We apply a high-level energy disaggregation through a pattern recognition approach in order to extract the consumption profile of its rooftop packaged units (RTUs) and present a forecast model for the energy consumption. Keywords: energy consumption forecasting, energy efficiency, load disaggregation, pattern recognition approach
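One simple way to realize a pattern-recognition step of this kind (and matching this page's search query) is to cluster daily load profiles with k-means so that cooling-dominated days separate from baseline days; the sketch below does this on synthetic profiles. The data, cluster count, and interpretation are illustrative assumptions, not the paper's method or dataset.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
hours = np.arange(24)

# Synthetic daily load profiles standing in for metered data: some days are
# dominated by an afternoon cooling (RTU) peak, others stay near baseline.
cooling_days = 5.0 + 8.0 * np.exp(-((hours - 15) / 3.0) ** 2) + rng.normal(0, 0.3, (40, 24))
baseline_days = 5.0 + rng.normal(0, 0.3, (40, 24))
profiles = np.vstack([cooling_days, baseline_days])

# Pattern-recognition step: group days into characteristic consumption profiles.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
for label in range(2):
    centroid = km.cluster_centers_[label]
    print(f"cluster {label}: peak load {centroid.max():.1f} at hour {centroid.argmax()}")
```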
Procedia PDF Downloads 278
12221 Using Heat-Mask in the Thermoforming Machine for Component Positioning in Thermoformed Electronics
Authors: Behnam Madadnia
Abstract:
For several years, 3D-shaped electronics have been on the rise, with many uses in home appliances, automotive, and manufacturing. One of the biggest challenges in the fabrication of 3D-shaped electronics, which are made by thermoforming, is repeatable and accurate component positioning; typically, there is no control over the final position of the component. This paper aims to address this issue and present a reliable approach for guiding the electronic components to the desired place during thermoforming. We propose a heat-control mask in the thermoforming machine to control the heating of the polymer, preventing specific parts from becoming formable, which assures the mechanical stability of the conductive traces during thermoforming of the substrate. We verified the accuracy of our approach by applying the method to a real industrial semi-sphere mold for positioning 7 LEDs and one touch sensor, and we measured the LEDs' positions after thermoforming to prove the repeatability of the process. The experimental results demonstrate that the proposed method is capable of positioning electronic components in thermoformed 3D electronics with high precision. Keywords: 3D-shaped electronics, electronic components, thermoforming, component positioning
Procedia PDF Downloads 97
12220 A Hybrid Pareto-Based Swarm Optimization Algorithm for the Multi-Objective Flexible Job Shop Scheduling Problems
Authors: Aydin Teymourifar, Gurkan Ozturk
Abstract:
In this paper, a new hybrid particle swarm optimization algorithm is proposed for the multi-objective flexible job shop scheduling problem, which is an important and hard combinatorial problem. The Pareto approach is used for solving the multi-objective problem. Several new local search heuristics based on the critical block concept are integrated into the algorithm to enhance its performance. The algorithm is compared with recently published multi-objective algorithms on benchmarks selected from the literature. Several metrics are used for quantifying performance and for comparing the achieved solutions. The algorithms are also compared based on the weighted summation of objectives approach. The proposed algorithm finds the Pareto solutions more efficiently than the compared algorithms and in less computational time. Keywords: swarm-based optimization, local search, Pareto optimality, flexible job shop scheduling, multi-objective optimization
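The Pareto approach mentioned above reduces, at its core, to keeping only non-dominated solutions. The sketch below extracts the non-dominated set from a handful of candidate schedules scored on two minimization objectives; the schedule names and objective values are invented for illustration.

```python
# Minimal Pareto-front extraction for minimisation objectives, e.g. candidate
# schedules scored by (makespan, total workload). Values are illustrative.
candidates = {
    "S1": (40, 110),
    "S2": (38, 120),
    "S3": (45, 100),
    "S4": (38, 115),
    "S5": (50, 130),
}

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [name for name, objs in candidates.items()
          if not any(dominates(other, objs)
                     for o_name, other in candidates.items() if o_name != name)]
print("non-dominated schedules:", pareto)   # ['S1', 'S3', 'S4']
```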
Procedia PDF Downloads 368
12219 A Formal Property Verification for Aspect-Oriented Programs in Software Development
Authors: Moustapha Bande, Hakima Ould-Slimane, Hanifa Boucheneb
Abstract:
Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of critical properties such as security ones. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done in order to better modularize the separation of concerns in software design and implementation. The goal is to prevent cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges in aspect-oriented programs is to be sure that all the pieces put together at weaving time satisfy the overall system requirements. Our paper focuses on this problem and proposes a formal verification approach for a given property of the woven program. The approach is based on the control flow graph (CFG) of the woven program and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied once the weaving is done. Keywords: aspect-oriented programming, control flow graph, property verification, satisfiability modulo theories
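A toy version of the SMT step is sketched below using the z3-solver Python bindings (an assumed tool choice, not necessarily the authors'): a path condition extracted from one CFG path of a woven program is conjoined with the negation of the property, and the solver searches for a counterexample. The variables, path condition, and property are illustrative.

```python
from z3 import Int, Bool, Solver, And, Not, Implies, sat

# Hypothetical state along one CFG path of a woven banking routine.
balance = Int("balance")
amount = Int("amount")
logged = Bool("logged")          # set by a hypothetical logging aspect's advice

# Path condition: a valid withdrawal on which the advice executed.
path_condition = And(balance >= 0, amount > 0, amount <= balance, logged)

# Property contributed by the aspect: every valid withdrawal is logged.
prop = Implies(amount <= balance, logged)

s = Solver()
s.add(path_condition, Not(prop))  # search for a counterexample on this path
if s.check() == sat:
    print("property violated, counterexample:", s.model())
else:
    print("property holds on this path")
```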
Procedia PDF Downloads 176
12218 The Differentiation of Performances among Immigrant Entrepreneurs: A Biographical Approach
Authors: Daniela Gnarini
Abstract:
This paper aims to contribute to the field of immigrants' entrepreneurial performance. The debate on immigrant entrepreneurship has been dominated by cultural explanations, which argue that immigrants’ entrepreneurial results are linked to groups’ characteristics. However, this approach does not consider important dimensions that influence entrepreneurial performances. Furthermore, cultural theories do not take into account the huge differences in performances also within the same ethnic group. For these reason, this study adopts a biographical approach, both at theoretical and at methodological level, which can allow to understand the main aspects that make the difference in immigrants' entrepreneurial performances, by exploring the narratives of immigrant entrepreneurs, who operate in the restaurant sector in two different Italian metropolitan areas: Milan and Rome. Through the qualitative method of biographical interviews, this study analyses four main dimensions and their combinations: a) individuals' entrepreneurial and migratory path: this aspect is particularly relevant to understand the biographical resources of immigrant entrepreneurs and their change and evolution during time; b) entrepreneurs' social capital, with a particular focus on their networks, through the adoption of a transnational perspective, that takes into account both the local level and the transnational connections. This study highlights that, though entrepreneurs’ connections are significant, especially as far as those with family members are concerned, often their entrepreneurial path assumes an individualised trajectory. c) Entrepreneurs' human capital, including both formal education and skills acquired through informal channels. The latter are particularly relevant since in the interviews and data collected the role of informal transmission emerges. d) Embeddedness within the social, political and economic context, to understand the main constraints and opportunities both at local and national level. The comparison between two different metropolitan areas within the same country helps to understand this dimension.Keywords: biographies, immigrant entrepreneurs, life stories, performance
Procedia PDF Downloads 226
12217 Using a Quantitative Reasoning Framework to Help Students Understand Arc Measure Relationships
Authors: David Glassmeyer
Abstract:
Quantitative reasoning is necessary for robustly understanding mathematical concepts from the elementary to the university level. Quantitative reasoning involves identifying and representing quantities and the relationships between these quantities. Without reasoning quantitatively, students often resort to memorizing formulas and procedures, which has negative impacts when they encounter mathematical topics in the future. This study investigated how high school students’ quantitative reasoning could be fostered within a unit on arc measure and angle relationships. Arc measure, or the measure of a central angle that cuts off a portion of a circle’s circumference, is often confused with arc length. In this study, the researcher redesigned an activity to clearly distinguish arc measure from arc length by using a quantitative reasoning framework. Data were collected from high school students to determine whether this approach impacted their understanding of these concepts. Initial data indicate that the approach was successful in supporting students’ quantitative reasoning about these topics. An implication of the work is that teachers themselves may also benefit from considering mathematical definitions within a quantitative reasoning framework and can use this activity in their own classrooms. Keywords: arc length, arc measure, quantitative reasoning, student content knowledge
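The distinction at the heart of the activity can be stated in one computation: for a fixed central angle, the arc measure (in degrees) is the same on any circle, while the arc length scales with the radius as r·θ (with θ in radians). The short sketch below, with illustrative numbers, makes that contrast explicit.

```python
import math

# Arc measure vs arc length for the same central angle on circles of two radii.
central_angle_deg = 60.0
for radius in (1.0, 3.0):
    arc_measure = central_angle_deg                           # degrees, radius-independent
    arc_length = radius * math.radians(central_angle_deg)     # r * theta (radians)
    print(f"r = {radius}: arc measure = {arc_measure:.0f} deg, arc length = {arc_length:.3f}")
```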
Procedia PDF Downloads 258
12216 Topology Optimization of Heat and Mass Transfer for Two Fluids under Steady State Laminar Regime: Application on Heat Exchangers
Authors: Rony Tawk, Boutros Ghannam, Maroun Nemer
Abstract:
The topology optimization technique is a potential tool for the design and optimization of structures involved in mass and heat transfer. The method starts with an initial intermediate domain and should be able to progressively distribute the solid and the two fluids exchanging heat. The multi-objective function of the problem takes into account the minimization of total pressure loss and the maximization of heat transfer between the solid and fluid subdomains. Existing methods account for the presence of only one fluid, whereas the present work extends the optimization to the distribution of a solid and two different fluids. This requires separating the channels of the two fluids and ensuring a minimum solid thickness between them, which is done by adding a third objective function to the multi-objective optimization problem. This article uses a density approach in which each cell holds two local design parameters ranging from 0 to 1, and the combination of their extremes defines the presence of solid, cold fluid, or hot fluid in that cell. The finite volume method is used for the direct solver, coupled with a discrete adjoint approach for the sensitivity analysis and the method of moving asymptotes for the numerical optimization. Several examples are presented to show the ability of the method to find a trade-off between the minimization of power dissipation and the maximization of heat transfer while ensuring the separation and continuity of the channel of each fluid, without crossing or mixing the fluids. The main conclusion is the possibility of finding an optimal bi-fluid domain using topology optimization, defining a fluid-to-fluid heat exchanger device. Keywords: topology optimization, density approach, bi-fluid domain, laminar steady state regime, fluid-to-fluid heat exchanger
Procedia PDF Downloads 399
12215 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature to the classification of engagement and distraction was shown to be eye gaze. It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability, human observation or reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement
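A stripped-down version of the evaluation pipeline described above is sketched here: a random forest scored with leave-one-out cross-validation, followed by a look at feature importances. The data are synthetic stand-ins for the multimodal features and the CPT-derived labels, and the hyperparameters are arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(3)

# Synthetic stand-in for the multimodal features (eye gaze, EEG, pose,
# interaction): 60 observation windows x 9 features, labelled engaged (1)
# or disengaged (0) from the continuous-performance-test outcome.
X = rng.normal(size=(60, 9))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 60) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())   # one held-out window per fold
print(f"leave-one-out accuracy: {scores.mean():.2f}")

# Feature importances indicate which input drives the classification
# (in the study, eye gaze was the most informative feature).
clf.fit(X, y)
print("top feature index:", int(np.argmax(clf.feature_importances_)))
```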
Procedia PDF Downloads 94
12214 Scalable and Accurate Detection of Pathogens from Whole-Genome Shotgun Sequencing
Authors: Janos Juhasz, Sandor Pongor, Balazs Ligeti
Abstract:
Next-generation sequencing, especially whole genome shotgun sequencing, is becoming a common approach for gaining insight into microbiomes in a culture-independent way, even in clinical practice. It not only gives us information about the species composition of an environmental sample but also opens up the possibility of detecting antimicrobial resistance and novel, or currently unknown, pathogens. Accurately and reliably detecting microbial strains is a challenging task. Here we present a sensitive approach for detecting pathogens in metagenomic samples, with special regard to detecting novel variants of known pathogens. We have developed a pipeline that uses fast, short-read aligner programs (i.e., Bowtie2/BWA) and comprehensive nucleotide databases. Taxonomic binning is based on the lowest common ancestor (LCA) principle: each read is assigned to the taxon covering the most significantly hit taxa. This approach helps in balancing sensitivity against running time. The program was tested on both experimental and synthetic data. The results indicate that our method performs as well as the state-of-the-art BLAST-based ones; furthermore, in some cases it even proves to be better, while running two orders of magnitude faster. It is sensitive and capable of identifying taxa present only in small abundance. Moreover, it needs two orders of magnitude fewer reads to complete the identification than MetaPhlAn2 does. We analyzed an experimental anthrax dataset (B. anthracis strain BA104). The majority of the reads (96.50%) were classified as Bacillus anthracis, and a small portion, 1.2%, was classified as other species from the Bacillus genus. We demonstrate that the evaluation of high-throughput sequencing data is feasible in a reasonable time with good classification accuracy. Keywords: metagenomics, taxonomy binning, pathogens, microbiome, B. anthracis
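The LCA binning rule mentioned above can be illustrated in a few lines: a read hitting several reference taxa is assigned to the deepest taxon shared by all of its hit lineages. The lineages below are simplified and illustrative, not the pipeline's actual reference database.

```python
# Minimal lowest-common-ancestor (LCA) assignment: a read is assigned to the
# deepest taxon shared by all of its significant reference hits.
lineages = {
    "Bacillus anthracis": ["root", "Bacteria", "Firmicutes", "Bacillus", "Bacillus anthracis"],
    "Bacillus cereus":    ["root", "Bacteria", "Firmicutes", "Bacillus", "Bacillus cereus"],
    "Escherichia coli":   ["root", "Bacteria", "Proteobacteria", "Escherichia", "Escherichia coli"],
}

def lca(hit_taxa):
    """Longest common prefix of the hit lineages = lowest common ancestor."""
    paths = [lineages[t] for t in hit_taxa]
    ancestor = "root"
    for level in zip(*paths):
        if len(set(level)) == 1:
            ancestor = level[0]
        else:
            break
    return ancestor

print(lca(["Bacillus anthracis"]))                      # Bacillus anthracis
print(lca(["Bacillus anthracis", "Bacillus cereus"]))   # Bacillus
print(lca(["Bacillus anthracis", "Escherichia coli"]))  # Bacteria
```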
Procedia PDF Downloads 137
12213 A Risk Assessment Tool for the Contamination of Aflatoxins on Dried Figs Based on Machine Learning Algorithms
Authors: Kottaridi Klimentia, Demopoulos Vasilis, Sidiropoulos Anastasios, Ihara Diego, Nikolaidis Vasileios, Antonopoulos Dimitrios
Abstract:
Aflatoxins are highly poisonous and carcinogenic compounds produced by species of the genus Aspergillus that can infect a variety of agricultural foods, including dried figs. Biological and environmental factors, such as the population, pathogenicity, and aflatoxinogenic capacity of the strains, and the topography, soil, and climate parameters of the fig orchards, are believed to have a strong effect on aflatoxin levels. Existing methods for aflatoxin detection and measurement, such as high-performance liquid chromatography (HPLC) and enzyme-linked immunosorbent assay (ELISA), can provide accurate results, but the procedures are usually time-consuming, sample-destructive, and expensive. Predicting aflatoxin levels prior to crop harvest is useful for minimizing the health and financial impact of a contaminated crop. Consequently, there is interest in developing a tool that predicts aflatoxin levels based on topography and soil analysis data of fig orchards. This paper describes the development of a risk assessment tool for aflatoxin contamination of dried figs, based on the location and altitude of the fig orchards, the population of the fungus Aspergillus spp. in the soil, and soil parameters such as pH, saturation percentage (SP), electrical conductivity (EC), organic matter, particle size analysis (sand, silt, clay), the concentration of the exchangeable cations (Ca, Mg, K, Na), extractable P, and trace elements (B, Fe, Mn, Zn and Cu), by employing machine learning methods. In particular, our proposed method integrates three machine learning techniques, i.e., dimensionality reduction on the original dataset (principal component analysis), metric learning (Mahalanobis metric for clustering), and the k-nearest neighbors learning algorithm (KNN), into an enhanced model with mean performance equal to 85% in terms of the Pearson correlation coefficient (PCC) between observed and predicted values. Keywords: aflatoxins, Aspergillus spp., dried figs, k-nearest neighbors, machine learning, prediction
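A reduced sketch of the modelling pipeline is given below: dimensionality reduction with PCA followed by k-nearest-neighbour prediction, evaluated by cross-validation. The intermediate Mahalanobis metric-learning step is omitted, and the soil/topography data and target values are synthetic placeholders, so the reported score has no relation to the paper's 85% PCC.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)

# Synthetic stand-in for the orchard dataset: 18 soil/topography variables
# (pH, EC, organic matter, cations, trace elements, altitude, ...) per orchard
# and a measured aflatoxin level as the target.
X = rng.normal(size=(120, 18))
y = 2.0 + X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.3, 120)

# PCA for dimensionality reduction, then k-nearest-neighbour prediction.
model = make_pipeline(StandardScaler(), PCA(n_components=5), KNeighborsRegressor(n_neighbors=5))
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.2f}")
```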
Procedia PDF Downloads 184
12212 The Approach of Male and Female Spectators about the Presence of Female Spectators in Sport Stadiums of Iran
Authors: Mohammad Reza Boroumand Devlagh, Seyed Mohammad Hosein Razavi, Fatemeh Ahmadi, Azam Fazli Darzi
Abstract:
The issue of female presence in Iranian stadiums has long been considered and debated by governmental experts and authorities; however, no conclusion has been reached yet. Thus, the present study was conducted with the aim of investigating the attitudes of male and female spectators towards the presence of female spectators in Iranian stadiums. The statistical population of the study includes all male and female spectators who have not experienced watching men's championship matches live in stadiums. 224 subjects from the statistical population were selected through stratified random sampling as the sample of the study. For data collection, a researcher-made questionnaire was used, whose validity was confirmed by university professors and whose reliability was examined and confirmed through a preliminary study (r = 0.81). Data analysis was done using descriptive and inferential statistics at P < 0.05. The results of the study showed that male and female spectators significantly agreed with female presence in stadiums, and there is no significant difference between male and female attitudes concerning female spectators’ presence in the sport stadiums of Iran (sig = 0.867). Keywords: male, female spectators, Iran, sport stadiums, population
Procedia PDF Downloads 547
12211 Enhancing Rural Agricultural Value Chains through Electric Mobility Services in Ethiopia
Authors: Clemens Pizzinini, Philipp Rosner, David Ziegler, Markus Lienkamp
Abstract:
Transportation is an integral part of most supply and value chains in modern economies. Smallholder farmers in rural Ethiopia face severe challenges along their supply and value chains. In particular, suitable, affordable, and available transport services are in high demand. To develop context-specific technical solutions, a problem-to-solution methodology based on interaction with technology is developed. With this approach, we fill the gap between proven transportation assessment frameworks and general user-centered techniques. Central to our approach is an electric test vehicle that is deployed in rural supply and value chains for research, development, and testing. Based on our objective and the derived methodological requirements, a set of existing methods is selected. Local partners are integrated into an organizational framework that executes major parts of this research endeavour in the Arsi Zone, Oromia Region, Ethiopia. Keywords: agricultural value chain, participatory methods, agile methods, sub-Saharan Africa, Ethiopia, electric vehicle, transport service
Procedia PDF Downloads 74
12210 Syntactic Ambiguity and Syntactic Analysis: Transformational Grammar Approach
Authors: Olufemi Olupe
Abstract:
Within linguistics, various approaches have been adopted to the study of language. One such approach is syntax. Syntax is the aspect of the grammar of a language which deals with how words are put together to form phrases and sentences and how such structures are interpreted in language. Ambiguity, which is also germane to this discourse, concerns the uncertainty of meaning that results from the possibility of a phrase or sentence being understood and interpreted in more than one way. In the light of the above, this paper attempts a syntactic study of syntactic ambiguities in the English language, using the Transformational Generative Grammar (TGG) approach. In doing this, phrases and sentences were presented, with each description followed by the relevant analysis. Findings in the work reveal that ambiguity cannot always be disambiguated by means of syntactic analysis alone, without recourse to semantic interpretation. A further finding shows that some syntactically ambiguous structures cannot be analysed into two surface structures even though there is more than one deep structure. The paper concludes that inasmuch as ambiguity remains in language, it will continue to pose a problem of understanding to second language learners. Users of English as a second language must, however, make a conscious effort to avoid its usage in order to achieve effective communication. Keywords: language, syntax, semantics, morphology, ambiguity
Procedia PDF Downloads 394
12209 Political Economy and Human Rights Engaging in Conversation
Authors: Manuel Branco
Abstract:
This paper argues that mainstream economics is one of the reasons that can explain the difficulty in fully realizing human rights because its logic is intrinsically contradictory to human rights, most especially economic, social and cultural rights. First, its utilitarianism, both in its cardinal and ordinal understanding, contradicts human rights principles. Maximizing aggregate utility along the lines of cardinal utility is a theoretical exercise that consists in ensuring as much as possible that gains outweigh losses in society. In this process an individual may get worse off, though. If mainstream logic is comfortable with this, human rights' logic does not. Indeed, universality is a key principle in human rights and for this reason the maximization exercise should aim at satisfying all citizens’ requests when goods and services necessary to secure human rights are at stake. The ordinal version of utilitarianism, in turn, contradicts the human rights principle of indivisibility. Contrary to ordinal utility theory that ranks baskets of goods, human rights do not accept ranking when these goods and services are necessary to secure human rights. Second, by relying preferably on market logic to allocate goods and services, mainstream economics contradicts human rights because the intermediation of money prices and the purpose of profit may cause exclusion, thus compromising the principle of universality. Finally, mainstream economics sees human rights mainly as constraints to the development of its logic. According to this view securing human rights would, then, be considered a cost weighing on economic efficiency and, therefore, something to be minimized. Fully realizing human rights needs, therefore, a different approach. This paper discusses a human rights-based political economy. This political economy, among other characteristics should give up mainstream economics narrow utilitarian approach, give up its belief that market logic should guide all exchanges of goods and services between human beings, and finally give up its view of human rights as constraints on rational choice and consequently on good economic performance. Giving up mainstream’s narrow utilitarian approach means, first embracing procedural utility and human rights-aimed consequentialism. Second, a more radical break can be imagined; non-utilitarian, or even anti-utilitarian, approaches may emerge, then, as alternatives, these two standpoints being not necessarily mutually exclusive, though. Giving up market exclusivity means embracing decommodification. More specifically, this means an approach that takes into consideration the value produced outside the market and an allocation process no longer necessarily centered on money prices. Giving up the view of human rights as constraints means, finally, to consider human rights as an expression of wellbeing and a manifestation of choice. This means, in turn, an approach that uses indicators of economic performance other than growth at the macro level and profit at the micro level, because what we measure affects what we do.Keywords: economic and social rights, political economy, economic theory, markets
Procedia PDF Downloads 152
12208 Matrix Method Posting
Authors: Varong Pongsai
Abstract:
The objective of this paper is to introduce a new method of accounting posting called Matrix Method Posting. This method is based on the matrix operations of pure mathematics. Although accounting is classified as a social science, many accounting operations are expressed with mathematical signs and operations. By applying these operations, it appears that the operations of mathematics can be applied to accounting. So, this paper tries to overlap mathematical logic with accounting logic smoothly. Following the context of discovery, a deductive approach is employed to prove a logical concept shared by both mathematics and accounting. The result proves that the matrix can be used to perform accounting operations perfectly, because matrix logic and accounting logic share a common concept, which is balancing two sides during operations. Moreover, matrix posting also has many benefits. It can help financial analysts calculate financial ratios comfortably. Furthermore, the matrix determinant, which is a signature operation in itself, also helps auditors check the correctness of clients’ recording: if the determinant is not equal to 0, it points out that the clients’ recording process has run into a problem. Finally, the matrix should easily accommodate the concepts of merger and consolidation far beyond the present-day concept. Keywords: matrix method posting, deductive approach, determinant, accounting application
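A minimal numerical sketch of the posting idea is shown below: journal entries become rows of a transactions-by-accounts matrix, row sums express the double-entry balance of each transaction, column sums give account balances, and (for a square posting matrix) a non-zero determinant flags an unbalanced recording. The accounts and figures are invented for illustration.

```python
import numpy as np

# Rows are transactions, columns are accounts (Cash, Inventory, Capital);
# debits are positive and credits are negative.
accounts = ["Cash", "Inventory", "Capital"]
postings = np.array([
    [+1000.0,     0.0, -1000.0],   # owner invests cash
    [ -400.0,  +400.0,     0.0],   # buy inventory for cash
    [ +150.0,  -150.0,     0.0],   # sell inventory for cash (at cost, for simplicity)
])

# Double-entry balance check: every transaction row must sum to zero.
assert np.allclose(postings.sum(axis=1), 0.0), "a transaction does not balance"

# Account balances are the column sums of the posting matrix.
for name, balance in zip(accounts, postings.sum(axis=0)):
    print(f"{name:>9s}: {balance:+.2f}")

# For a square posting matrix, balanced rows force a zero determinant, so a
# non-zero determinant flags an unbalanced (erroneous) recording.
print("determinant:", round(float(np.linalg.det(postings)), 6))
```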
Procedia PDF Downloads 367
12207 A Parallel Implementation of Artificial Bee Colony Algorithm within CUDA Architecture
Authors: Selcuk Aslan, Dervis Karaboga, Celal Ozturk
Abstract:
Artificial Bee Colony (ABC) algorithm is one of the most successful swarm intelligence based metaheuristics. It has been applied to a number of constrained or unconstrained numerical and combinatorial optimization problems. In this paper, we presented a parallelized version of ABC algorithm by adapting employed and onlooker bee phases to the Compute Unified Device Architecture (CUDA) platform which is a graphical processing unit (GPU) programming environment by NVIDIA. The execution speed and obtained results of the proposed approach and sequential version of ABC algorithm are compared on functions that are typically used as benchmarks for optimization algorithms. Tests on standard benchmark functions with different colony size and number of parameters showed that proposed parallelization approach for ABC algorithm decreases the execution time consumed by the employed and onlooker bee phases in total and achieved similar or better quality of the results compared to the standard sequential implementation of the ABC algorithm.Keywords: Artificial Bee Colony algorithm, GPU computing, swarm intelligence, parallelization
Procedia PDF Downloads 378
12206 PID Sliding Mode Control with Sliding Surface Dynamics based Continuous Control Action for Robotic Systems
Authors: Wael M. Elawady, Mohamed F. Asar, Amany M. Sarhan
Abstract:
This paper adopts a continuous sliding mode control scheme for trajectory tracking control of robot manipulators with structured and unstructured uncertain dynamics and external disturbances. In this algorithm, the equivalent control in the conventional sliding mode control is replaced by a PID control action. Moreover, the discontinuous switching control signal is replaced by a continuous proportional-integral (PI) control term such that the implementation of the proposed control algorithm does not require the prior knowledge of the bounds of unknown uncertainties and external disturbances and completely eliminates the chattering phenomenon of the conventional sliding mode control approach. The closed-loop system with the adopted control algorithm has been proved to be globally stable by using Lyapunov stability theory. Numerical simulations using the dynamical model of robot manipulators with modeling uncertainties demonstrate the superiority and effectiveness of the proposed approach in high speed trajectory tracking problems.Keywords: PID, robot, sliding mode control, uncertainties
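A toy simulation in the spirit of the scheme is sketched below for a single unit-mass joint: the sliding variable s = ė + λe is driven by a continuous PI term, added to a PID action on the tracking error, in place of a discontinuous switching term. The plant, disturbance, and gains are illustrative and are not the paper's robot model or tuning.

```python
import numpy as np

# Single unit-mass double integrator q'' = u + disturbance tracking q_des(t).
dt, T = 1e-3, 5.0
t = np.arange(0.0, T, dt)
q_des, qd_des = np.sin(t), np.cos(t)       # desired trajectory and its derivative

q, qd = 0.5, 0.0                           # initial state (offset from the trajectory)
int_e, int_s = 0.0, 0.0
Kp, Ki, Kd = 25.0, 5.0, 10.0               # PID gains on the tracking error (ad hoc)
lam, ks_p, ks_i = 5.0, 20.0, 50.0          # sliding-surface slope and PI gains on s

errors = []
for k in range(t.size):
    e = q_des[k] - q
    ed = qd_des[k] - qd
    int_e += e * dt
    s = ed + lam * e                       # sliding variable
    int_s += s * dt
    # continuous control: PID on e plus PI on s (no sign(s) switching term)
    u = Kp * e + Ki * int_e + Kd * ed + ks_p * s + ks_i * int_s
    disturbance = 2.0 * np.sin(3.0 * t[k])  # unknown bounded disturbance
    qdd = u + disturbance
    qd += qdd * dt                          # explicit Euler integration
    q += qd * dt
    errors.append(abs(e))

print(f"tracking error: start {errors[0]:.3f}, final {errors[-1]:.4f}")
```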
Procedia PDF Downloads 508
12205 Predicting Machine-Down of Woodworking Industrial Machines
Authors: Matteo Calabrese, Martin Cimmino, Dimos Kapetis, Martina Manfrin, Donato Concilio, Giuseppe Toscano, Giovanni Ciandrini, Giancarlo Paccapeli, Gianluca Giarratana, Marco Siciliano, Andrea Forlani, Alberto Carrotta
Abstract:
In this paper, we describe a machine learning methodology for Predictive Maintenance (PdM) applied to industrial woodworking machines. PdM is a prominent strategy consisting of all the operational techniques and actions required to ensure machine availability and to prevent a machine-down failure. One of the challenges with the PdM approach is to design and develop an embedded smart system to assess the health status of the machine. The proposed approach allows multiple connected machines to be screened simultaneously, thus providing real-time monitoring that can be adopted within maintenance management. This is achieved by applying temporal feature engineering techniques and training an ensemble of classification algorithms to predict the Remaining Useful Lifetime of woodworking machines. The effectiveness of the methodology is demonstrated by testing on an independent sample of additional woodworking machines that had not presented a machine-down event. Keywords: predictive maintenance, machine learning, connected machines, artificial intelligence
Procedia PDF Downloads 226
12204 From Problem Space to Executional Architecture: The Development of a Simulator to Examine the Effect of Autonomy on Mainline Rail Capacity
Authors: Emily J. Morey, Kevin Galvin, Thomas Riley, R. Eddie Wilson
Abstract:
The key challenges faced by integrating autonomous rail operations into the existing mainline railway environment have been identified through the understanding and framing of the problem space and stakeholder analysis. This was achieved through the completion of the first four steps of Soft Systems Methodology, where the problem space has been expressed via conceptual models. Having identified these challenges, we investigated one of them, namely capacity, via the use of models and simulation. This paper examines the approach used to move from the conceptual models to a simulation which can determine whether the integration of autonomous trains can plausibly increase capacity. Within this approach, we developed an architecture and converted logical models into physical resource models and associated design features which were used to build a simulator. From this simulator, we are able to analyse mixtures of legacy-autonomous operations and produce fundamental diagrams and trajectory plots to describe the dynamic behaviour of mixed mainline railway operations.Keywords: autonomy, executable architecture, modelling and simulation, railway capacity
Procedia PDF Downloads 83