Search results for: finite element approach
12710 An Experimental Approach to the Influence of Tipping Points and Scientific Uncertainties in the Success of International Fisheries Management
Authors: Jules Selles
Abstract:
The Atlantic and Mediterranean bluefin tuna fishery has been considered the archetype of an overfished and mismanaged fishery. This crisis has demonstrated the role of public awareness and the importance of the interactions between science and management regarding scientific uncertainties. This work aims at investigating the policy making process associated with a regional fisheries management organization. We propose a contextualized computer-based experimental approach in order to explore the effects of key factors on the cooperation process in a complex straddling stock management setting. Namely, we analyze the effects of the introduction of a socio-economic tipping point and the uncertainty surrounding the estimation of the resource level. Our approach is based on a Gordon-Schaefer bio-economic model which explicitly represents the decision making process. Each participant plays the role of an ICCAT stakeholder, represents a coalition of fishing nations involved in the fishery, and unilaterally decides a harvest policy for the coming year. The context of the experiment induces the incentives for exploitation and collaboration to achieve common sustainable harvest plans at the Atlantic bluefin tuna stock scale. Our rigorous framework allows testing how stakeholders who plan the exploitation of a fish stock (a common pool resource) respond to two kinds of effects: i) the inclusion of a drastic shift in the management constraints (beyond a socio-economic tipping point) and ii) an increasing uncertainty in the scientific estimation of the resource level.
Keywords: economic experiment, fisheries management, game theory, policy making, Atlantic bluefin tuna
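The Gordon-Schaefer dynamics underlying such an experiment can be sketched as a logistic surplus-production model with an annual harvest decision. The growth rate `r` and carrying capacity `K` below are illustrative values, not the calibration used in the study:

```python
def schaefer_step(biomass, harvest, r=0.3, K=1.0):
    """One year of logistic stock growth minus harvest (Gordon-Schaefer)."""
    growth = r * biomass * (1 - biomass / K)
    return max(biomass + growth - harvest, 0.0)

def simulate(b0, harvest_policy, years, r=0.3, K=1.0):
    """Apply a constant annual harvest policy and return the biomass trajectory."""
    b = b0
    trajectory = [b]
    for _ in range(years):
        b = schaefer_step(b, harvest_policy, r, K)
        trajectory.append(b)
    return trajectory
```

Harvesting at the maximum sustainable yield (r*K/4, taken at biomass K/2) holds the stock steady, while a slightly larger quota drives it toward collapse, which is the tension the experiment's participants face.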
Procedia PDF Downloads 253
12709 The Current Ways of Thinking Mild Traumatic Brain Injury and Clinical Practice in a Trauma Hospital: A Pilot Study
Authors: P. Donnelly, G. Mitchell
Abstract:
Traumatic Brain Injury (TBI) is a major contributor to the global burden of disease; despite its ubiquity, there is significant variation in diagnosis, prognosis, and treatment between clinicians. This study aims to examine the spectrum of approaches that currently exist at a Level 1 Trauma Centre in Australasia by surveying Emergency Physicians and Neurosurgeons on those aspects of mTBI. A pilot survey of 17 clinicians (Neurosurgeons, Emergency Physicians, and others who manage patients with mTBI) at a Level 1 Trauma Centre in Brisbane, Australia, was conducted. The objective of this study was to examine the importance these clinicians place on various elements in their approach to the diagnosis, prognostication, and treatment of mTBI. The data were summarised and the descriptive statistics reported. Loss of consciousness and post-traumatic amnesia were rated as the most important signs or symptoms in diagnosing mTBI (median importance of 8). MRI was the most important imaging modality in diagnosing mTBI (median importance of 7). ‘Number of Previous TBIs’ and ‘Intracranial Injury on Imaging’ were rated as the most important elements for prognostication (median importance of 9). Education and reassurance were rated as the most important modality for treating mTBI (median importance of 7). Variation between the specialties in the importance placed on each of these components was not statistically significant. In this Australian tertiary trauma centre, there appears to be variation in how clinicians approach mTBI. This study is underpowered to state whether this variation lies between clinicians within a specialty or between specialties. This variation is worth investigating as a step toward a unified approach to diagnosing, prognosticating, and treating this common pathology.
Keywords: mild traumatic brain injury, adult, clinician, survey
Procedia PDF Downloads 130
12708 Developing Heat-Power Efficiency Criteria for Characterization of Technosphere Structural Elements
Authors: Victoria Y. Garnova, Vladimir G. Merzlikin, Sergey V. Khudyakov, Aleksandr A. Gajour, Andrei P. Garnov
Abstract:
This paper refers to the analysis of the characteristics of the heat-energy objects of industrial and lifestyle facilities as part of the thermal envelope of the Earth's surface, for inclusion in any database of economic forecasting. An idealized model of the Earth's surface is discussed. This model gives the opportunity to obtain the energy equivalent for each element of terrain and world ocean. An energy efficiency criterion of comfortable human existence is introduced. The dynamics of this criterion offer the possibility to simulate possible technogenic catastrophes arising from the spontaneous industrial development of certain Earth areas. A calculated model is given, with a confirmed forecast of Gulf Stream freezing in the Polar Regions in 2011 due to the disturbance of the heat-energy balance of the oil-polluted oceanic subsurface layer. Two opposing trends of human development under limited and unlimited amounts of heat-energy resources are analyzed.
Keywords: Earth's surface, heat-energy consumption, energy criteria, technogenic catastrophes
Procedia PDF Downloads 323
12707 Recursive Doubly Complementary Filter Design Using Particle Swarm Optimization
Authors: Ju-Hong Lee, Ding-Chen Chung
Abstract:
This paper deals with the optimal design of recursive doubly complementary (DC) digital filters using a metaheuristic-based optimization technique. Based on the theory of DC digital filters using two recursive digital all-pass filters (DAFs), the design problem is formulated so as to yield an objective function which is a weighted sum of the phase response errors of the designed DAFs. To ensure the stability of the recursive DC filters during the design process, we impose the necessary constraints on the phases of the recursive DAFs. Through a frequency sampling and a weighted least squares approach, the optimization problem of the objective function can be solved by utilizing a population-based stochastic optimization approach. The resulting DC digital filters can possess satisfactory frequency response. Simulation results are presented for illustration and comparison.
Keywords: doubly complementary, digital all-pass filter, weighted least squares algorithm, particle swarm optimization
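A minimal sketch of the kind of objective involved, for a single first-order all-pass section evaluated at sampled frequencies; the filter order, weights, and frequency grid here are illustrative assumptions, not the paper's actual formulation:

```python
import cmath

def allpass_phase(a, w):
    """Phase response at radian frequency w of the first-order real
    all-pass H(z) = (a + z^-1) / (1 + a * z^-1)."""
    z_inv = cmath.exp(-1j * w)
    return cmath.phase((a + z_inv) / (1 + a * z_inv))

def wls_phase_error(a, freqs, desired, weights):
    """Weighted least-squares phase-error objective over sampled frequencies,
    the quantity a particle swarm would minimize over the coefficient a."""
    return sum(wt * (allpass_phase(a, w) - d) ** 2
               for w, d, wt in zip(freqs, desired, weights))
```

For a = 0 the section reduces to a pure delay z^-1 with phase -w, which gives a quick sanity check on the objective.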
Procedia PDF Downloads 688
12706 A Framework for Successful TQM Implementation and Its Effect on the Organizational Sustainability Development
Authors: Redha Elhuni, M. Munir Ahmad
Abstract:
The main purpose of this research is to construct a generic model for the successful implementation of Total Quality Management (TQM) in the oil sector, and to find out the effects of this model on the organizational sustainability development (OSD) performance of Libyan oil and gas companies using the structural equation modeling (SEM) approach. The research approach covers both quantitative and qualitative methods. A questionnaire was developed in order to identify the quality factors that are seen by Libyan oil and gas companies to be critical to the success of TQM implementation. Hypotheses were developed to evaluate the impact of TQM implementation on OSD. Data analysis reveals that there is a significant positive effect of the TQM implementation on OSD. Twenty-four quality factors are found to be critical and absolutely essential for successful TQM implementation. The results generated a structure for the TQM-SD implementation framework based on four major road map constructs (top management commitment, employee involvement and participation, customer-driven processes, and continuous improvement culture).
Keywords: total quality management, critical success factors, oil and gas, organizational sustainability development (SD), Libya
Procedia PDF Downloads 274
12705 Digital Control Algorithm Based on Delta-Operator for High-Frequency DC-DC Switching Converters
Authors: Renkai Wang, Tingcun Wei
Abstract:
In this paper, a digital control algorithm based on the delta-operator is presented for high-frequency digitally-controlled DC-DC switching converters. The stability and the controlling accuracy of the DC-DC switching converters are improved by using the delta-operator-based digital control algorithm without increasing the hardware circuit scale. The design method of the voltage compensator in the delta-domain using PID (proportional-integral-derivative) control is given, and simulation results based on the Simulink platform are provided, which verify the theoretical analysis very well. It can be concluded that the presented delta-operator-based control algorithm has better stability and controlling accuracy, and easier hardware implementation, than existing control algorithms based on the z-operator; it can therefore be used for voltage compensator design in high-frequency digitally-controlled DC-DC switching converters.
Keywords: digitally-controlled DC-DC switching converter, digital voltage compensator, delta-operator, finite word length, stability
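As a rough illustration of the delta-operator idea (not the paper's compensator), consider a discrete PID whose integrator state advances through the delta operator, delta x = (x[k+1] - x[k]) / T; this parameterization stays well-conditioned at the very short sample periods used in high-frequency converters, which is where z-domain coefficients suffer under finite word length:

```python
class DeltaPID:
    """Discrete PID compensator with a delta-operator integrator state.
    Gains and sample period T are illustrative placeholders."""

    def __init__(self, kp, ki, kd, T):
        self.kp, self.ki, self.kd, self.T = kp, ki, kd, T
        self.integ = 0.0       # delta-integrator state
        self.prev_err = 0.0

    def update(self, err):
        # delta-operator derivative estimate of the error
        deriv = (err - self.prev_err) / self.T
        out = self.kp * err + self.ki * self.integ + self.kd * deriv
        # advance the delta-integrator state: x[k+1] = x[k] + T * err
        self.integ += self.T * err
        self.prev_err = err
        return out
```

In a converter loop, `update` would be called once per switching period with the output-voltage error, and its result would drive the duty-cycle modulator.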
Procedia PDF Downloads 412
12704 Immiscible Polymer Blends with Controlled Nanoparticle Location for Excellent Microwave Absorption: A Compartmentalized Approach
Authors: Sourav Biswas, Goutam Prasanna Kar, Suryasarathi Bose
Abstract:
In order to obtain better materials, control of the precise location of nanoparticles is indispensable. It is shown here that an ordered arrangement of nanoparticles possessing different characteristics (electrical/magnetic dipoles) in the blend structure can result in excellent microwave absorption. This is manifested in a high reflection loss of ca. -67 dB for the best blend structure designed here. To attenuate electromagnetic radiation, the key parameters, i.e., high electrical conductivity and large dielectric/magnetic loss, are targeted here using a conducting inclusion [multiwall carbon nanotubes, MWNTs]; a ferroelectric nanostructured material with associated relaxations in the GHz frequency range [barium titanate, BT]; and lossy ferromagnetic nanoparticles [nickel ferrite, NF]. In this study, bi-continuous structures were designed using 50/50 (by wt) blends of polycarbonate (PC) and polyvinylidene fluoride (PVDF). The MWNTs were modified using an electron acceptor molecule, a derivative of perylenediimide, which facilitates π-π stacking with the nanotubes and stimulates efficient charge transport in the blends. The nanoscopic materials have a specific affinity towards the PVDF phase. Hence, by introducing surface-active groups, an ordered arrangement can be tailored. To accomplish this, both BT and NF were first hydroxylated, followed by the introduction of amine-terminal groups on the surface. The latter facilitated a nucleophilic substitution reaction with PC and resulted in their precise location. In this study, we have shown for the first time that superior EM attenuation can be achieved by a compartmentalized approach. For instance, when the nanoparticles were localized exclusively in the PVDF phase or in both phases, the minimum reflection loss was ca. -18 dB (for the MWNT/BT mixture) and -29 dB (for the MWNT/NF mixture), and the shielding was primarily through reflection.
Interestingly, by adopting the compartmentalized approach wherein the lossy materials were in the PC phase and the conducting inclusion (MWNT) in PVDF, an outstanding reflection loss of ca. -57 dB (for the BT and MWNT combination) and -67 dB (for the NF and MWNT combination) was noted, and the shielding was primarily through absorption. Thus, the approach demonstrates that nanoscopic structuring in the blends can be achieved under macroscopic processing conditions, and this strategy can be further explored to design microwave absorbers.
Keywords: barium titanate, EMI shielding, MWNTs, nickel ferrite
Procedia PDF Downloads 448
12703 An Abductive Approach to Policy Analysis: Policy Analysis as Informed Guessing
Authors: Adrian W. Chew
Abstract:
This paper argues that education policy analysis tends to be steered towards empiricist oriented approaches, which place emphasis on objective and measurable data. However, it argues that empiricist oriented approaches are generally based on inductive and/or deductive reasoning, which are unable to generate new ideas/knowledge. This paper will outline the logical structure of induction, deduction, and abduction, and argues that only abduction provides possibilities for the creation of new ideas/knowledge. This paper proposes the neologism of ‘informed guessing’ as a reformulation of abduction, and also as an approach to education policy analysis. On one side, the signifier ‘informed’ encapsulates the idea that abductive policy analysis needs to be informed by descriptive conceptualization theory to be able to make relations and connections between, and within, observed phenomena and unobservable general structures. On the other side, the signifier ‘guessing’ captures the cyclical and unsystematic process of abduction. This paper will end with a brief example of utilising ‘informed guessing’ for a policy analysis of school choice lotteries in the United States.
Keywords: abductive reasoning, empiricism, informed guessing, policy analysis
Procedia PDF Downloads 353
12702 Analyses and Optimization of Physical and Mechanical Properties of Direct Recycled Aluminium Alloy (AA6061) Wastes by ANOVA Approach
Authors: Mohammed H. Rady, Mohd Sukri Mustapa, S. Shamsudin, M. A. Lajis, A. Wagiman
Abstract:
The present study is aimed at investigating the microhardness and density of aluminium alloy chips when subjected to various settings of preheating temperature and preheating time. Three values of preheating temperature were taken: 450 °C, 500 °C, and 550 °C. On the other hand, three values of preheating time were chosen: 1, 2, and 3 hours. The influences of the process parameters (preheating temperature and time) were analyzed using a Design of Experiments (DOE) approach, whereby a full factorial design with center point analysis was adopted. There were 11 runs in total, comprising a two-factor full factorial design with 3 center points. The responses were microhardness and density. The results showed that the density and microhardness increased with decreasing preheating temperature. The results also found that the preheating temperature is more important to control than the preheating time in the microhardness analysis, while both the preheating temperature and preheating time are important in the density analysis. It can be concluded that setting the temperature at 450 °C for 1 hour resulted in the optimum responses.
Keywords: AA6061, density, DOE, hot extrusion, microhardness
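One plausible reading of the 11-run design is the 8 non-center points of a two-factor, three-level grid plus 3 replicated center runs; the coded-unit construction below is an assumption for illustration, not the study's exact run sheet:

```python
from itertools import product

def three_level_factorial(factors, n_center=3):
    """3-level full factorial in coded units (-1, 0, +1), with the single
    center run replaced by n_center replicated center points."""
    pts = [dict(zip(factors, lv))
           for lv in product((-1, 0, 1), repeat=len(factors))]
    factorial = [p for p in pts if any(v != 0 for v in p.values())]
    center = [{f: 0 for f in factors} for _ in range(n_center)]
    return factorial + center

def decode(run, ranges):
    """Map coded levels (-1, 0, +1) to physical values, e.g. temp 450-550 C."""
    return {f: ranges[f][0] + (run[f] + 1) / 2 * (ranges[f][1] - ranges[f][0])
            for f in run}
```

The replicated center points are what allow the pure-error and curvature checks that a center-point analysis relies on.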
Procedia PDF Downloads 350
12701 Analysis of Fault Tolerance on Grid Computing in Real Time Approach
Authors: Parampal Kaur, Deepak Aggarwal
Abstract:
In the computational Grid, fault tolerance is an imperative issue to be considered during job scheduling. Due to the widespread use of resources, systems are highly prone to errors and failures. Hence, fault tolerance plays a key role in the Grid to avoid the problem of unreliability. Scheduling tasks to the appropriate resources is a vital requirement in the computational Grid. The fittest-resource scheduling algorithm searches for the appropriate resource based on the job requirements, in contrast to general scheduling algorithms, where jobs are scheduled to the resources with the best performance factor. The proposed method improves the fault tolerance of the fittest-resource scheduling algorithm by scheduling the job in coordination with job replication when the resource has low reliability. Based on the reliability index of the resource, the resource is identified as critical. The tasks are scheduled based on the criticality of the resources. Results show that the execution time of the tasks is comparatively reduced with the proposed algorithm using a real-time approach rather than a simulator.
Keywords: computational grid, fault tolerance, task replication, job scheduling
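A toy sketch of the replication idea: each task goes to its best-fitting resource, and is additionally replicated onto the next-best one when the primary's reliability index marks it as critical. The fitness measure (smallest spare capacity) and the reliability threshold are illustrative assumptions, not the paper's exact definitions:

```python
def schedule(tasks, resources, reliability_threshold=0.8):
    """Fittest-resource scheduling with replication on critical resources.
    Each task dict has 'id' and 'need'; each resource dict has 'id',
    'capacity', and 'reliability'."""
    plan = {}
    for task in tasks:
        # fittest resource = smallest spare capacity that still meets the need
        fit = sorted((r for r in resources if r["capacity"] >= task["need"]),
                     key=lambda r: r["capacity"] - task["need"])
        primary = fit[0]
        plan[task["id"]] = [primary["id"]]
        # replicate the job if the chosen resource is deemed critical
        if primary["reliability"] < reliability_threshold and len(fit) > 1:
            plan[task["id"]].append(fit[1]["id"])
    return plan
```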
Procedia PDF Downloads 436
12700 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach
Authors: D. Tedesco, G. Feletti, P. Trucco
Abstract:
The present study aims to develop a Decision Support System (DSS) to support the operational decisions of the Emergency Medical Service (EMS) regarding the assignment of medical emergency requests to Emergency Departments (EDs). In the literature, this problem is also known as “hospital selection” and concerns the definition of policies for the selection of the ED to which patients who require further treatment are transported by ambulance. The research methodology began with a review of the technical-scientific literature concerning DSSs to support EMS management and, in particular, the hospital selection decision. The literature analysis showed that current studies are mainly focused on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a request. All ED-related issues are therefore excluded and considered as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes the transport time and releases the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that considers information relating to the subsequent phases of the process, such as the case-mix, the expected service throughput times, and the operational capacity of the different EDs. To this end, a Discrete Event Simulation (DES) model was created to evaluate different hospital selection policies. The next steps of the research therefore consisted of the development of a general simulation architecture, its implementation in the AnyLogic software, and its validation on a realistic dataset.
The hospital selection policy that produced the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the start of the clinical evaluation by the doctor in the ED. Finally, two approaches were compared: a static approach, based on a retrospective estimate of the TTP, and a dynamic approach, based on a predictive estimate of the TTP determined with a constantly updated Winters model. Findings reveal that adopting the minimization of TTP as a hospital selection policy brings several benefits. It significantly reduces service throughput times in the ED with only a minimal increase in travel time. Furthermore, it produces an immediate view of the saturation state of the EDs and takes into account the case-mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Besides, the predictive approach is certainly more reliable in terms of TTP estimation than the retrospective approach, but it is more difficult to apply. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
Keywords: discrete event simulation, emergency medical services, forecast model, hospital selection
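The "constantly updated Winters model" refers to triple exponential smoothing; a minimal additive variant of the update and forecast steps can be sketched as follows, with smoothing constants chosen for illustration rather than taken from the study:

```python
def winters_update(level, trend, seasonal, obs, season_idx,
                   alpha=0.3, beta=0.1, gamma=0.2):
    """One additive Holt-Winters update: smooth level, trend, and the
    seasonal term for the current slot, given a new TTP observation."""
    s = seasonal[season_idx]
    new_level = alpha * (obs - s) + (1 - alpha) * (level + trend)
    new_trend = beta * (new_level - level) + (1 - beta) * trend
    seasonal[season_idx] = gamma * (obs - new_level) + (1 - gamma) * s
    return new_level, new_trend

def forecast(level, trend, seasonal, season_idx, h=1):
    """h-step-ahead additive forecast from the current smoothed state."""
    return level + h * trend + seasonal[(season_idx + h) % len(seasonal)]
```

In the dynamic policy, each completed transport would feed one `winters_update`, and `forecast` would supply the predicted TTP used to rank candidate EDs.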
Procedia PDF Downloads 91
12699 Software User Experience Enhancement through User-Centered Design and Co-design Approach
Authors: Shan Wang, Fahad Alhathal, Hari Subramanian
Abstract:
User-centered design skills play an important role in crafting a positive and intuitive user experience for software applications. Embracing a user-centric design approach involves understanding the needs, preferences, and behaviors of the end-users throughout the design process. This mindset not only enhances the usability of the software but also fosters a deeper connection between the digital product and its users. This paper reports on a 6-month knowledge exchange collaboration, conducted in 2023 between a UK academic institution and an industry partner, which aims to improve the user experience of a digital platform used as a knowledge management tool: to understand users' preferences for features, identify sources of frustration, and pinpoint areas for enhancement. This research applied one of the most effective methods of implementing user-centered design, co-design workshops for testing user onboarding experiences, which involve the active participation of users in the design process. More specifically, in January 2023, we organized eight co-design workshops with a diverse group of 11 individuals. Throughout these co-design workshops, we accumulated a total of 11 hours of qualitative data in both video and audio formats. Subsequently, we conducted an analysis of user journeys, distilling common issues and potential areas for improvement into three insights. This analysis was pivotal in guiding the knowledge management software team in prioritizing feature enhancements and design improvements. Employing a user-centered design thinking process, we developed a series of graphic design solutions in collaboration with the software management tool company. These solutions were targeted at refining onboarding user experiences, workplace interfaces, and interactive design. Some of these design solutions were translated into tangible interfaces for the knowledge management tool.
By actively involving users in the design process and valuing their input, developers can create products that are not only functional but also resonate with the end-users, ultimately leading to greater success in the competitive software landscape. In conclusion, this paper not only contributes insights into designing onboarding user experiences for software within a co-design approach but also presents key theories on leveraging the user-centered design process in software design to enhance overall user experiences.
Keywords: user experience design, user-centered design, co-design approach, knowledge management tool
Procedia PDF Downloads 10
12698 Money Laundering Risk Assessment in the Banking Institutions: An Experimental Approach
Authors: Yusarina Mat-Isa, Zuraidah Mohd-Sanusi, Mohd-Nizal Haniff, Paul A. Barnes
Abstract:
Given that money laundering has become a prominent concern for banking institutions, it is an obligation for them to adopt a risk-based approach as an integral component of their accepted anti-money laundering policies. In doing so, those involved with banking operations are the most critical group of personnel, as these are the people who deal with the day-to-day operations of the banking institutions and are obligated to form a judgement on the level of impending risk. This requirement extends to all relevant banking institution staff, such as tellers and customer account representatives, who must identify suspicious customers and escalate them to the relevant authorities. Banking institution staff, however, face enormous challenges in identifying and distinguishing money launderers from other legitimate customers seeking genuine banking transactions. Banking institution staff are mostly educated and trained with the business objective in mind to serve the customers, and are not trained to be “detectives with a detective’s power of observation”. Despite increasing awareness as well as training conducted for banking institution staff, their competency in assessing money laundering risk is still insufficient. Several gaps have prompted this study, including the lack of behavioural perspectives in the assessment of money laundering risk in banking institutions. Utilizing an experimental approach, respondents are randomly assigned within a controlled setting to manipulated situations, upon which the judgement of the respondents is solicited based on various observations related to the situations. The study suggests that it is imperative that informed judgement is exercised in arriving at the decision to proceed with the banking services required by the customers. Judgement forms a basis of opinion for banking institution staff to decide whether customers pose a money laundering risk.
Failure to exercise good judgement could result in losses and the absorption of unnecessary risk into the banking institutions. Although banking institutions have a choice of automated solutions for assessing money laundering risk, the human factor in assessing the risk is indispensable. Individual staff in the banking institutions are the first line of defence, responsible for screening the impending risk of any customer soliciting banking services. Ultimately, the individual's role in money laundering risk assessment is not a substitute for automated solutions, as human judgement is inimitable.
Keywords: banking institutions, experimental approach, money laundering, risk assessment
Procedia PDF Downloads 267
12697 Numerical Analysis of the Flow Characteristics Around a Deformable Vortex Generator
Authors: Aimad Koulali
Abstract:
Flow structure evolution around a single pair of Delta vortex generators (VGs) is studied numerically. For laminar, transient, and turbulent flow regimes, numerical simulations have been performed in a duct with a pair of Delta vortex generators. The finite element method was used to simulate the flow. To formulate the fluid-structure interaction problem, the ALE formulation was used. The aim of this study is to provide a detailed insight into the generation and dissipation of longitudinal vortices over a wide range of flow regimes, including the laminar-turbulent transition. A wide range of parameters has been exploited to describe the induced phenomena within the flow. We examined various parameters depending on the VG geometry, the flow regime, and the channel geometry. A detailed analysis of the turbulence and wall shear stress properties has been carried out. The results affirm that optimal values still exist for obtaining better-performing vortices in order to improve the exchange performance.
Keywords: finite element method, deformable vortex generator, numerical analysis, fluid-structure interaction, ALE formulation, turbulent flow
Procedia PDF Downloads 99
12696 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model
Authors: Didier Auroux, Vladimir Groza
Abstract:
This work is part of the STEEP Marie-Curie ITN project, and it focuses on the identification of unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The necessity of studying this problem comes from industrial milling applications, where the ability to predict and model the final surface with high accuracy is one of the primary tasks in the absence of any knowledge of the model parameters that should be used. In this framework, we propose the identification of model parameters by minimizing a cost function measuring the difference between experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, gives the opportunity to find the unknowns of the AWJM model and their optimal values, which could be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strictly depends on the input measurements. Regularization terms can be effectively used to deal with the presence of data noise and to improve the identification correctness. Based on this approach, we present 2D and 3D results for the identification of the model parameters and for the surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain acceptable results for manufacturing and to expect the proper identification of unknowns.
This approach also gives us the ability to extend the research to more complex cases, such as a 3D time-dependent model with variations of the jet feed speed.
Keywords: abrasive waterjet milling, inverse problem, model parameters identification, regularization
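A toy illustration of identification-by-cost-minimization: here finite-difference gradients stand in for the adjoint computation, and a simple linear model stands in for the AWJM PDE, so the snippet shows only the optimization loop, not the paper's model:

```python
def identify(model, params, data, xs, lr=0.05, steps=500, eps=1e-6):
    """Gradient-descent identification of model parameters by minimizing
    the squared misfit between model output and measurements."""
    p = list(params)

    def cost(q):
        return sum((model(x, q) - y) ** 2 for x, y in zip(xs, data))

    for _ in range(steps):
        # forward-difference gradient (the adjoint would give this exactly)
        grad = []
        for i in range(len(p)):
            q = list(p)
            q[i] += eps
            grad.append((cost(q) - cost(p)) / eps)
        p = [pi - lr * g for pi, g in zip(p, grad)]
    return p
```

With noisy `data`, a regularization term added to `cost` (e.g. a penalty on parameter magnitude) plays the stabilizing role described in the abstract.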
Procedia PDF Downloads 316
12695 Disaggregating and Forecasting the Total Energy Consumption of a Building: A Case Study of a High Cooling Demand Facility
Authors: Juliana Barcelos Cordeiro, Khashayar Mahani, Farbod Farzan, Mohsen A. Jafari
Abstract:
Energy disaggregation has been a focus of many energy companies, since energy efficiency can be achieved when the breakdown of energy consumption is known. Companies have been investing in technologies to come up with software and/or hardware solutions that can provide this type of information to the consumer. On the other hand, not everyone can afford these technologies. Therefore, in this paper, we present a methodology for breaking down the aggregate consumption and identifying the high-demanding end-use profiles. These energy profiles will be used to build the forecast model for optimal control purposes. A facility with a high cooling load is used as an illustrative case study to demonstrate the results of the proposed methodology. We apply a high-level energy disaggregation through a pattern recognition approach in order to extract the consumption profile of its rooftop packaged units (RTUs) and present a forecast model for the energy consumption.
Keywords: energy consumption forecasting, energy efficiency, load disaggregation, pattern recognition approach
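As a rough sketch of the disaggregation idea (a naive event-based pass, not the paper's pattern recognition method), a unit like an RTU that cycles on and off with a characteristic step size can be isolated from the aggregate meter signal by matching its on/off steps:

```python
def extract_cycling_load(aggregate, step_threshold):
    """Reconstruct the profile of one cycling unit from an aggregate signal:
    a positive step above the threshold marks switch-on (remember its draw),
    a matching negative step marks switch-off."""
    on_power = 0.0
    profile = [0.0]
    for prev, cur in zip(aggregate, aggregate[1:]):
        delta = cur - prev
        if on_power == 0.0 and delta > step_threshold:
            on_power = delta          # unit switched on; remember its draw
        elif on_power > 0.0 and delta < -step_threshold:
            on_power = 0.0            # matching off-step: unit switched off
        profile.append(on_power)
    return profile
```

The extracted profile is the kind of end-use series that would then feed the forecast model.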
Procedia PDF Downloads 278
12694 Establishment of the Regression Uncertainty of the Critical Heat Flux Power Correlation for an Advanced Fuel Bundle
Authors: L. Q. Yuan, J. Yang, A. Siddiqui
Abstract:
A new regression uncertainty analysis methodology was applied to determine the uncertainties of the critical heat flux (CHF) power correlation for an advanced 43-element bundle design, which was developed by Canadian Nuclear Laboratories (CNL) to achieve improved economics, resource utilization, and energy sustainability. The new methodology is considered more appropriate than the traditional methodology for assessing the experimental uncertainty associated with regressions. The methodology was first assessed using both the Monte Carlo Method (MCM) and the Taylor Series Method (TSM) on a simple linear regression model, and then successfully extended to a non-linear CHF power regression model (CHF power as a function of inlet temperature, outlet pressure, and mass flow rate). The regression uncertainty assessed by MCM agrees well with that assessed by TSM. An equation to evaluate the CHF power regression uncertainty was developed and expressed as a function of the independent variables that determine the CHF power.
Keywords: CHF experiment, CHF correlation, regression uncertainty, Monte Carlo Method, Taylor Series Method
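The MCM side of such an assessment can be sketched for the simple linear case: perturb the measurements with their assumed noise, refit the regression each time, and take the spread of the resulting predictions as the regression uncertainty. The noise model and trial count below are illustrative assumptions:

```python
import random
import statistics

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def mc_prediction_sd(xs, ys, x0, noise_sd, n_trials=2000, seed=1):
    """Monte Carlo regression uncertainty: resample the measurements with
    Gaussian noise, refit, and report the spread of predictions at x0."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_trials):
        ys_perturbed = [y + rng.gauss(0.0, noise_sd) for y in ys]
        a, b = fit_line(xs, ys_perturbed)
        preds.append(a + b * x0)
    return statistics.stdev(preds)
```

A TSM check would propagate the same `noise_sd` analytically through the least-squares formulas and should agree with the Monte Carlo spread.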
Procedia PDF Downloads 416
12693 Using Heat-Mask in the Thermoforming Machine for Component Positioning in Thermoformed Electronics
Authors: Behnam Madadnia
Abstract:
For several years, 3D-shaped electronics have been on the rise, with many uses in home appliances, automotive, and manufacturing. One of the biggest challenges in the fabrication of 3D-shaped electronics, which are made by thermoforming, is repeatable and accurate component positioning; typically there is no control over the final position of the component. This paper aims to address this issue and presents a reliable approach for guiding the electronic components to the desired place during thermoforming. We propose a heat-control mask in the thermoforming machine to control the heating of the polymer, preventing specific parts from becoming formable, which ensures the mechanical stability of the conductive traces during thermoforming of the substrate. We verified our approach's accuracy by applying our method to a real industrial semi-sphere mold for positioning 7 LEDs and one touch sensor. We measured the LEDs' positions after thermoforming to prove the process's repeatability. The experimental results demonstrate that the proposed method is capable of positioning electronic components in thermoformed 3D electronics with high precision.
Keywords: 3D-shaped electronics, electronic components, thermoforming, component positioning
Procedia PDF Downloads 97
12692 Electro-Mechanical Response and Engineering Properties of Piezocomposite with Imperfect Interface
Authors: Rattanan Tippayaphalapholgul, Yasothorn Sapsathiarn
Abstract:
Composites of piezoelectric materials are widely used in practical applications such as nondestructive testing devices, smart adaptive structures and medical devices. A thorough understanding of the coupled electro-elastic response and properties of piezocomposites is crucial for the development and design of piezoelectric composite materials used in advanced applications. A micromechanics analysis is employed in this paper to determine the response and engineering properties of the piezocomposite. Mechanically imperfect interface bonding between the piezoelectric inclusion and the polymer matrix is taken into consideration in the analysis. The micromechanics analysis is based on the Boundary Element Method (BEM) together with the periodic micro-field micromechanics theory. A selected set of numerical results is presented to investigate the influence of volume ratio and interface bonding condition on the effective piezocomposite material coefficients and to portray basic features of the coupled electro-elastic response within the domain of the piezocomposite unit cell.
Keywords: effective engineering properties, electroelastic response, imperfect interface, piezocomposite
Procedia PDF Downloads 232
12691 A Hybrid Pareto-Based Swarm Optimization Algorithm for the Multi-Objective Flexible Job Shop Scheduling Problems
Authors: Aydin Teymourifar, Gurkan Ozturk
Abstract:
In this paper, a new hybrid particle swarm optimization algorithm is proposed for the multi-objective flexible job shop scheduling problem, which is an important and hard combinatorial problem. The Pareto approach is used for solving the multi-objective problem. Several new local search heuristics are integrated into an algorithm based on the critical block concept to enhance the performance of the algorithm. The algorithm is compared with recently published multi-objective algorithms on benchmarks selected from the literature. Several metrics are used for quantifying performance and comparing the achieved solutions. The algorithms are also compared based on the weighted summation of objectives approach. The proposed algorithm can find the Pareto solutions more efficiently than the compared algorithms and in less computational time.
Keywords: swarm-based optimization, local search, Pareto optimality, flexible job shop scheduling, multi-objective optimization
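The Pareto approach the abstract relies on reduces to a dominance test between objective vectors. A minimal, generic sketch (objective names and values are invented for illustration, not taken from the paper's benchmarks):

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated objective vectors."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# Each tuple: (makespan, total workload) for one candidate schedule
candidates = [(10, 7), (9, 9), (12, 6), (11, 7), (9, 10)]
print(pareto_front(candidates))  # [(10, 7), (9, 9), (12, 6)]
```

In a swarm algorithm like the one described, such a filter is typically applied to an external archive each generation so that only non-dominated schedules survive.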
Procedia PDF Downloads 369
12690 A Formal Property Verification for Aspect-Oriented Programs in Software Development
Authors: Moustapha Bande, Hakima Ould-Slimane, Hanifa Boucheneb
Abstract:
Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of some critical properties, such as security ones. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done in order to better modularize the separation of concerns in software design and implementation. The goal is to prevent cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges in aspect-oriented programs is to be sure that all the pieces put together at weaving time ensure the satisfiability of the overall system requirements. Our paper focuses on this problem and proposes a formal verification approach for a given property of the woven program. The approach is based on the control flow graph (CFG) of the woven program and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied or not once the weaving is done.
Keywords: aspect-oriented programming, control flow graph, property verification, satisfiability modulo theories
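Stripped of the SMT machinery, the core question the pipeline asks of the woven program's CFG is a reachability one: can control flow reach a node whose state violates the property? The toy sketch below illustrates only that reachability skeleton in plain Python (the node names and the example aspect are invented; the paper's actual approach encodes the check for an SMT solver rather than searching directly):

```python
def violates_property(cfg, entry, bad_nodes):
    """Depth-first search over a control flow graph: the property fails
    if any 'bad' node (one whose state violates the property) is
    reachable from the entry point."""
    seen, stack = set(), [entry]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        if node in bad_nodes:
            return True
        stack.extend(cfg.get(node, []))
    return False

# Toy CFG of a woven program: a logging aspect inserts a 'log_secret' node
cfg = {
    "entry": ["check_auth"],
    "check_auth": ["business_logic", "deny"],
    "business_logic": ["log_secret", "exit"],
    "log_secret": ["exit"],
}
print(violates_property(cfg, "entry", {"log_secret"}))  # True
```

Here the weaving has made an unsafe node reachable, so the security property fails; an SMT-based check additionally reasons about the data conditions under which each edge can be taken.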
Procedia PDF Downloads 177
12689 Unconfined Strength of Nano Reactive Silica Sand Powder Concrete
Authors: Hossein Kabir, Mojtaba Sadeghi
Abstract:
Nowadays, high-strength concrete is an integral element of a variety of high-rise buildings. On the other hand, finding a suitable aggregate size distribution is a great concern; hence, a concrete mix proportion is presented that has no coarse aggregate yet still provides the desired strength. Nano Reactive Silica Sand Powder Concrete (NRSSPC) is a type of concrete with no coarse material in its composition. In this concrete, the only aggregate found in the mix design is silica sand powder with a particle size below 150 µm, which is extremely small compared with normal concrete aggregate. The research aim is to find the compressive strength of this particular concrete under different conditions of curing and consolidation and to compare the approaches. In this study, the young concrete specimens were compacted by pressing or vibration. It is worth mentioning that, in order to show the influence of temperature in the curing process, the concrete specimens were cured either in 20 °C lime water or autoclaved in a 90 °C oven.
Keywords: reactive silica sand powder concrete (RSSPC), consolidation, compressive strength, normal curing, thermal accelerated curing
Procedia PDF Downloads 248
12688 Applications of Nanoparticles via Laser Ablation in Liquids: A Review
Authors: Fawaz M. Abdullah, Abdulrahman M. Al-Ahmari, Madiha Rafaqat
Abstract:
Laser ablation of a solid target in a liquid leads to the fabrication of nanoparticles (NPs) of different compositions, such as metals, alloys, oxides, carbides, and hydroxides. The fabrication of NPs in liquids by laser ablation has grown rapidly in the last decades compared to other techniques. Nowadays, laser ablation has been improved to prepare different types of NPs with special morphologies, microstructures, phases, and sizes, which can be applied in various fields. The paper reviews and highlights the different sizes, shapes and application fields of nanoparticles produced by laser ablation under different liquids and with different materials. The paper also provides a case study of titanium NPs produced by laser ablation in distilled water. The size of NPs is an important parameter, especially for their usage and applications. The size and shape were analyzed by SEM; energy-dispersive X-ray analysis (EDAX) was applied to evaluate the oxidation and elemental composition of the titanium NPs, and XRD was used to evaluate the phase composition and the peaks of titanium and some elements. The SEM analysis showed that the synthesized NPs ranged between 15-35 nm in size, which makes them applicable in various fields, for example as annihilators of cancerous cells.
Keywords: nanoparticles, laser ablation, titanium NPs, applications
Procedia PDF Downloads 140
12687 The Differentiation of Performances among Immigrant Entrepreneurs: A Biographical Approach
Authors: Daniela Gnarini
Abstract:
This paper aims to contribute to the field of immigrants' entrepreneurial performance. The debate on immigrant entrepreneurship has been dominated by cultural explanations, which argue that immigrants' entrepreneurial results are linked to group characteristics. However, this approach does not consider important dimensions that influence entrepreneurial performances. Furthermore, cultural theories do not take into account the huge differences in performance even within the same ethnic group. For these reasons, this study adopts a biographical approach, at both the theoretical and methodological level, which allows an understanding of the main aspects that make the difference in immigrants' entrepreneurial performances, by exploring the narratives of immigrant entrepreneurs who operate in the restaurant sector in two different Italian metropolitan areas: Milan and Rome. Through the qualitative method of biographical interviews, this study analyses four main dimensions and their combinations. a) Individuals' entrepreneurial and migratory paths: this aspect is particularly relevant to understanding the biographical resources of immigrant entrepreneurs and their change and evolution over time. b) Entrepreneurs' social capital, with a particular focus on their networks, through the adoption of a transnational perspective that takes into account both the local level and transnational connections. This study highlights that, though entrepreneurs' connections are significant, especially those with family members, their entrepreneurial path often assumes an individualised trajectory. c) Entrepreneurs' human capital, including both formal education and skills acquired through informal channels; the latter are particularly relevant since the role of informal transmission emerges in the interviews and data collected. d) Embeddedness within the social, political and economic context, to understand the main constraints and opportunities at both the local and national level; the comparison between two different metropolitan areas within the same country helps to understand this dimension.
Keywords: biographies, immigrant entrepreneurs, life stories, performance
Procedia PDF Downloads 227
12686 Using a Quantitative Reasoning Framework to Help Students Understand Arc Measure Relationships
Authors: David Glassmeyer
Abstract:
Quantitative reasoning is necessary to robustly understand mathematical concepts from the elementary to the university level. Quantitative reasoning involves identifying and representing quantities and the relationships between these quantities. Without reasoning quantitatively, students often resort to memorizing formulas and procedures, which has negative impacts when they encounter mathematical topics in the future. This study investigated how high school students' quantitative reasoning could be fostered within a unit on arc measure and angle relationships. Arc measure, or the measure of a central angle that cuts off a portion of a circle's circumference, is often confused with arc length. In this study, the researcher redesigned an activity to clearly distinguish arc measure from arc length by using a quantitative reasoning framework. Data were collected from high school students to determine if this approach impacted their understanding of these concepts. Initial data indicate the approach was successful in supporting students' quantitative reasoning about these topics. An implication of the work is that teachers themselves may also benefit from considering mathematical definitions from a quantitative reasoning framework and can use this activity in their own classrooms.
Keywords: arc length, arc measure, quantitative reasoning, student content knowledge
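The distinction the activity targets can be made explicit through the relationship between the two quantities: arc measure is an angle, while arc length also depends on the radius. For a central angle of measure $\theta$ degrees in a circle of radius $r$ (a standard identity, not taken from the study's materials):

```latex
\text{arc measure} = \theta \ (\text{degrees}), \qquad
\text{arc length} = \frac{\theta}{360} \cdot 2\pi r
```

Two arcs can therefore share the same measure yet have different lengths whenever the radii differ, which is precisely the confusion the redesigned activity addresses.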
Procedia PDF Downloads 258
12685 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detecting student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data, labeled by objective performance outcomes, to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a new type of continuous performance test, the Seek-X type, is introduced. Nine features were extracted, including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification accuracy for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-nearest neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature for the classification of engagement and distraction was shown to be eye gaze.
We have shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability or human observation and is not reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement
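The evaluation protocol named above, leave-one-out cross-validation over labeled feature vectors, can be sketched generically. The snippet uses a trivial nearest-neighbour stand-in classifier and invented toy data, not the study's random forest or its actual sensor features:

```python
def nearest_neighbour_predict(train, query):
    """Trivial stand-in classifier: label of the closest training vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda fl: dist(fl[0], query))
    return label

def leave_one_out_accuracy(samples):
    """Hold out each labeled sample in turn, train on the rest,
    and score the held-out prediction."""
    correct = 0
    for i, (features, label) in enumerate(samples):
        train = samples[:i] + samples[i + 1:]
        if nearest_neighbour_predict(train, features) == label:
            correct += 1
    return correct / len(samples)

# Toy (feature_vector, label) pairs; the two feature dimensions stand in
# for hypothetical normalized sensor features (e.g., gaze-on-task, EEG ratio)
samples = [
    ((0.9, 0.8), "engaged"), ((0.85, 0.9), "engaged"), ((0.8, 0.7), "engaged"),
    ((0.2, 0.1), "disengaged"), ((0.15, 0.3), "disengaged"), ((0.1, 0.2), "disengaged"),
]
print(leave_one_out_accuracy(samples))  # 1.0 on this cleanly separated toy set
```

With only 59 sessions from four students, leave-one-out is a natural choice: every labeled sample contributes to both training and testing without ever scoring a model on data it was fit to.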
Procedia PDF Downloads 94
12684 FengShui Paradigm as Philosophy of Sustainable Design
Authors: E. Erdogan, H. A. Erdogan
Abstract:
FengShui, an old Chinese discipline dating back more than 5,000 years, is one of the design principles that aim at creating habitable and sustainable spaces in harmony with nature by systematizing data within its own structure. Having emerged from Chinese mysticism and embodying elements of faith in its principles, FengShui argues that the positive energy in the environment channels human behavior and psychology. This argument is supported by the thesis from quantum physics that 'everything is made up of energy', and thereby gains an important place. In spaces where living and working take place according to its principles and systematized rules, FengShui promises a happier, more peaceful and comfortable life by influencing human psychology, actions, and soul, as well as the professional and social life of the individual. Observing these design properties in houses, workplaces, offices, the environment, and daily life as a design paradigm is significant. In this study, how FengShui, a tradition emanating from Chinese mysticism, shapes design and how it is used as an element of sustainable design will be explained.
Keywords: Feng Shui, design principle, sustainability, philosophy
Procedia PDF Downloads 542
12683 Scalable and Accurate Detection of Pathogens from Whole-Genome Shotgun Sequencing
Authors: Janos Juhasz, Sandor Pongor, Balazs Ligeti
Abstract:
Next-generation sequencing, especially whole genome shotgun sequencing, is becoming a common approach to gaining insight into microbiomes in a culture-independent way, even in clinical practice. It not only gives us information about the species composition of an environmental sample but also opens the possibility of detecting antimicrobial resistance and novel, or currently unknown, pathogens. Accurately and reliably detecting the microbial strains is a challenging task. Here we present a sensitive approach for detecting pathogens in metagenomics samples, with special regard to detecting novel variants of known pathogens. We have developed a pipeline that uses fast short-read aligner programs (e.g., Bowtie2/BWA) and comprehensive nucleotide databases. Taxonomic binning is based on the lowest common ancestor (LCA) principle; each read is assigned to the taxon covering the most significantly hit taxa. This approach helps in balancing between sensitivity and running time. The program was tested on both experimental and synthetic data. The results indicate that our method performs as well as the state-of-the-art BLAST-based ones; furthermore, in some cases, it even proves to be better, while running two orders of magnitude faster. It is sensitive and capable of identifying taxa present only in small abundance. Moreover, it needs two orders of magnitude fewer reads to complete the identification than MetaPhlAn2 does. We analyzed an experimental anthrax dataset (B. anthracis strain BA104). The majority of the reads (96.50%) were classified as Bacillus anthracis; a small portion, 1.2%, was classified as other species from the Bacillus genus. We demonstrate that the evaluation of high-throughput sequencing data is feasible in a reasonable time with good classification accuracy.
Keywords: metagenomics, taxonomy binning, pathogens, microbiome, B. anthracis
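The lowest-common-ancestor binning step described above can be sketched as a walk up a taxonomy tree: collect each hit taxon's lineage to the root and assign the read to the deepest node shared by all significant hits. The taxonomy below is a tiny invented fragment, not the pipeline's actual database:

```python
def path_to_root(parent, taxon):
    """Return the lineage from a taxon up to the root of the taxonomy."""
    path = [taxon]
    while taxon in parent:
        taxon = parent[taxon]
        path.append(taxon)
    return path

def lowest_common_ancestor(parent, taxa):
    """Assign a read to the deepest taxon shared by all of its hits."""
    paths = [path_to_root(parent, t) for t in taxa]
    shared = set(paths[0]).intersection(*map(set, paths[1:]))
    # The first shared node on a root-ward path is the deepest common one.
    return next(node for node in paths[0] if node in shared)

# Minimal taxonomy fragment (child -> parent)
parent = {
    "B. anthracis": "Bacillus",
    "B. cereus": "Bacillus",
    "Bacillus": "Bacillaceae",
    "E. coli": "Enterobacteriaceae",
    "Bacillaceae": "Bacteria",
    "Enterobacteriaceae": "Bacteria",
}
print(lowest_common_ancestor(parent, ["B. anthracis", "B. cereus"]))  # Bacillus
print(lowest_common_ancestor(parent, ["B. anthracis", "E. coli"]))    # Bacteria
```

Reads whose hits agree at the species level keep a species-level assignment, while ambiguous reads are pushed up to genus or higher, which is how the approach trades a little specificity for robustness to novel variants.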
Procedia PDF Downloads 137
12682 Screening of Osteoporosis in Aging Populations
Authors: Massimiliano Panella, Sara Bortoluzzi, Sophia Russotto, Daniele Nicolini, Carmela Rinaldi
Abstract:
Osteoporosis affects more than 200 million people worldwide. About 75% of osteoporosis cases are undiagnosed or diagnosed only when a bone fracture occurs. Since osteoporosis-related fractures are significant determinants of the burden of disease and of the health and social costs of aging populations, we believe that the early identification and treatment of high-risk patients should be a priority in today's healthcare systems. Screening for osteoporosis by dual-energy X-ray absorptiometry (DEXA) is not cost-effective for the general population. An alternative is pulse-echo ultrasound (PEUS) because of its lower cost. To this end, we developed an early detection program for osteoporosis with PEUS and evaluated its possible impact and sustainability. We conducted a cross-sectional study including 1,050 people in Italy. Subjects with >1 major or >2 minor risk factors for osteoporosis were invited to a PEUS bone mass density (BMD) measurement at the proximal tibia. Based on BMD values, subjects were classified as healthy (BMD > 0.783 g/cm²) or pathological, the latter including subjects with suspected osteopenia (0.783 ≥ BMD > 0.719 g/cm²) or osteoporosis (BMD ≤ 0.719 g/cm²). The responder rate was 60.4% (634/1,050). According to the risk assessment, a PEUS scan was recommended to 436 people, of whom 300 (mean age 45.2, 81% women) accepted to participate. We identified 240 (80%) healthy and 60 (20%) pathological subjects (47 osteopenic and 13 osteoporotic). We observed a significant association between high-risk people and reduced bone density (p = 0.043), with increased risks for female gender, older age, and menopause (p < 0.01). The yearly cost of the screening program was 8,242 euros. With current Italian fracture incidence rates in osteoporotic patients, we can reasonably expect that at least 6 fractures will occur in our sample within 20 years. If we consider that the mean cost per fracture in Italy today is 16,785 euros, we can estimate a theoretical cost of 100,710 euros.
According to the literature, we can assume that the early treatment of osteoporosis could avoid 24,170 euros of these costs. If we add the actual yearly cost of the treatments to the cost of our program and compare this final amount of 11,682 euros with the avoidable fracture costs (24,170 euros), we can estimate a possible positive benefit/cost ratio of 2.07. As a major outcome, our study allowed us to identify early 60 people with significant bone loss who were not aware of their condition. This diagnostic anticipation constitutes an important element of value for the project, both for the patients, given the preventable negative outcomes caused by fractures, and for society in general, because of the related avoidable costs. Therefore, based on our findings, we believe that the PEUS-based screening performed here could be a cost-effective approach to the early identification of osteoporosis. However, our study has some major limitations. In fact, the economic analysis is based on theoretical scenarios, so specific studies are needed for a better estimation of the possible benefits and costs of our program.
Keywords: osteoporosis, prevention, public health, screening
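The benefit/cost arithmetic reported above can be retraced directly from the figures in the abstract (all values in euros; the yearly treatment cost is not stated explicitly and is inferred here as the difference between the 11,682-euro total outlay and the 8,242-euro program cost):

```python
# Figures taken from the abstract
fractures_expected = 6          # fractures expected in the sample over 20 years
cost_per_fracture = 16_785      # mean cost per fracture in Italy
avoidable_cost = 24_170         # fracture costs avoidable via early treatment
program_cost = 8_242            # yearly cost of the screening program
total_outlay = 11_682           # program cost plus yearly treatment costs

theoretical_cost = fractures_expected * cost_per_fracture  # expected fracture bill
treatment_cost = total_outlay - program_cost               # inferred treatment share
benefit_cost_ratio = avoidable_cost / total_outlay

print(theoretical_cost, treatment_cost, round(benefit_cost_ratio, 2))
# 100710 3440 2.07
```

The recomputed values match the abstract's 100,710-euro theoretical fracture cost and 2.07 benefit/cost ratio.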
Procedia PDF Downloads 122
12681 The Approach of Male and Female Spectators about the Presence of Female Spectators in Sport Stadiums of Iran
Authors: Mohammad Reza Boroumand Devlagh, Seyed Mohammad Hosein Razavi, Fatemeh Ahmadi, Azam Fazli Darzi
Abstract:
The issue of female presence in Iran's stadiums has long been considered and debated by governmental experts and authorities; however, no conclusion has been reached yet. Thus, the present study was done with the aim of investigating the approach of male and female spectators to the presence of female spectators in Iranian stadiums. The statistical population of the study includes all male and female spectators who have not experienced watching male championship matches live in stadiums. 224 subjects from the statistical population were selected through stratified random sampling as the sample of the study. For data collection, a researcher-made questionnaire was used, whose validity was confirmed by university professors and whose reliability was studied and confirmed through a preliminary study (r = 0.81). Data analysis was done using descriptive and inferential statistics at p < 0.05. The results of the study showed that males and females significantly agreed with female presence in stadiums and that there is no significant difference between male and female approaches concerning female spectators' presence in the sport stadiums of Iran (sig = 0.867).
Keywords: male, female spectators, Iran, sport stadiums, population
Procedia PDF Downloads 548