Search results for: variation approach

12945 Instability of H2-O2-CO2 Premixed Flames on Flat Burner

Authors: Kaewpradap Amornrat, Endo Takahiro, Kadowaki Satoshi

Abstract:

The combustion of hydrogen-oxygen (H2-O2) mixtures was investigated with a view to reducing carbon dioxide (CO2) and nitrogen oxide (NOx) greenhouse emissions. Because the flame speed of H2-O2 mixtures is normally very high, the mixture must be moderated by CO2 addition, giving H2-O2-CO2 combustion. In this study, part of the hydrogen was replaced by CO2 at O2:CO2 ratios of 1:3.76, 1:4 and 1:5. The combustion of H2-O2-CO2 on a flat burner at an equivalence ratio of 0.5 was investigated for mixture flow rates of 10, 15 and 20 L/min. As the CO2 ratio increases, the power spectral density decreases and the size of the reconstructed attractor and of the cellular flames becomes larger, because the decrease of hydrogen replaced by CO2 affects the diffusive-thermal instability. Conversely, as the mixture flow rate increases, the power spectral density increases and the reconstructed attractor and cell size become smaller, owing to decreasing instability. The results show that the variation of CO2 content and mixture flow rate affects the instability of cellular premixed flames on a flat burner.
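
As a rough illustration of the signal analysis described above, the sketch below estimates a power spectral density (Welch's method) and reconstructs a delay-embedding attractor from a sampled flame signal. It is a minimal sketch assuming a one-dimensional time series (e.g., flame luminosity or pressure) sampled at a known rate; the embedding delay and dimension are placeholder values, and this is not the authors' actual processing chain.

```python
import numpy as np
from scipy.signal import welch

def analyze_flame_signal(x, fs, delay=10, dim=3):
    """Estimate PSD and a delay-embedded attractor from a 1-D flame signal.

    x     : sampled signal (e.g. flame luminosity), shape (N,)
    fs    : sampling frequency in Hz
    delay : embedding delay in samples (assumed value)
    dim   : embedding dimension (assumed value)
    """
    # Power spectral density via Welch's method
    freqs, psd = welch(x, fs=fs, nperseg=min(1024, len(x)))

    # Delay-coordinate embedding: each row is a point of the reconstructed attractor
    n_points = len(x) - (dim - 1) * delay
    attractor = np.column_stack(
        [x[i * delay: i * delay + n_points] for i in range(dim)]
    )

    # A crude "size" of the attractor: RMS radius around its centroid
    size = np.sqrt(np.mean(np.sum((attractor - attractor.mean(axis=0)) ** 2, axis=1)))
    return freqs, psd, attractor, size
```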

Keywords: instability, H2-O2-CO2 combustion, flat burner, diffusive-thermal instability

Procedia PDF Downloads 356
12944 Analysis of Fault Tolerance on Grid Computing in Real Time Approach

Authors: Parampal Kaur, Deepak Aggarwal

Abstract:

In the computational grid, fault tolerance is an imperative issue to be considered during job scheduling. Because resources are widely distributed and heavily used, such systems are highly prone to errors and failures; fault tolerance therefore plays a key role in avoiding unreliability. Scheduling each task to an appropriate resource is a vital requirement in the computational grid. The fittest-resource scheduling algorithm searches for the appropriate resource based on the job requirements, in contrast to general scheduling algorithms, where jobs are scheduled to the resources with the best performance factor. The proposed method improves the fault tolerance of the fittest-resource scheduling algorithm by scheduling the job in coordination with job replication when the resource has low reliability. A resource is identified as critical based on its reliability index, and tasks are scheduled according to the criticality of the resources. Results obtained with a real-time approach, rather than a simulator, show that the execution time of the tasks is comparatively reduced with the proposed algorithm.
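
The sketch below illustrates the scheduling idea described above: a resource is flagged as critical when its reliability index falls below a threshold, and a job assigned to a critical resource is replicated onto the next-fittest resource. The threshold, data structures and selection rule are assumptions made for illustration, not the paper's exact algorithm.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    reliability: float   # reliability index in [0, 1]
    fitness: float       # how well the resource matches the job requirements

def schedule_with_replication(job, resources, reliability_threshold=0.8):
    """Fittest-resource scheduling with replication on low-reliability resources."""
    # Pick the resource that best fits the job requirements (highest fitness)
    ranked = sorted(resources, key=lambda r: r.fitness, reverse=True)
    primary = ranked[0]
    assignments = [(job, primary.name)]

    # If the chosen resource is critical, replicate the job on the next-fittest one
    if primary.reliability < reliability_threshold and len(ranked) > 1:
        assignments.append((job, ranked[1].name))  # replica
    return assignments
```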

Keywords: computational grid, fault tolerance, task replication, job scheduling

Procedia PDF Downloads 430
12943 Labor Legislation and Female Economic Empowerment: Evidence from Night Work, Regulatory and Seating Laws

Authors: Lamis Kattan, Joanne Haddad

Abstract:

This paper examines the impact of gender-focused labor legislation on women's labor force participation and economic empowerment. We rely on historical legislative acts passed by state legislatures and exploit whether or not states passed laws regulating overall and industry-specific employment and working conditions for women, night work laws, and labor laws requiring the provision of seats for working women. We exploit the fact that not all states enacted these laws, as well as the variation in the timing of their enactment. Our results show that, relative to men, women in treated states are more likely to be in the labor force after the introduction of night work laws than women in control states. We also document the effect of industry-specific labor policies on women's likelihood of being employed in the affected industry and in higher-wage occupations within that industry. The policy implications of our findings endorse the adoption of labor laws in favor of women to advance their empowerment through greater involvement in the labor market and financial independence.
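
The identification strategy described above, staggered adoption of laws across states, is commonly estimated with a two-way fixed-effects difference-in-differences regression. The sketch below shows one such specification as an illustration only; the variable names, data file and clustering choice are assumptions, not the authors' actual estimation code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-year panel: in_labor_force (0/1), female (0/1),
# law_in_force (1 once the state has enacted the law), plus state and year.
df = pd.read_csv("labor_panel.csv")  # assumed file

# Interaction term: do women (relative to men) respond once the law is in force,
# with state and year fixed effects and standard errors clustered by state?
model = smf.ols(
    "in_labor_force ~ female * law_in_force + C(state) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["state"]})
print(model.summary())
```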

Keywords: female employment, labor laws, marriage, fertility

Procedia PDF Downloads 91
12942 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support the operational decision of the Emergency Medical Service (EMS) regarding the assignment of medical emergency requests to Emergency Departments (ED). In the literature, this problem is also known as “hospital selection” and concerns the definition of policies for the selection of the ED to which patients who require further treatment are transported by ambulance. The research methodology began with a review of the technical-scientific literature on DSSs supporting EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies are mainly focused on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a request. Therefore, all the ED-related issues are excluded and considered as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes the transport time and releases the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to the EDs, considering information relating to the subsequent phases of the process, such as the case-mix, the expected service throughput times, and the operational capacity of the different EDs in hospitals. To this end, a Discrete Event Simulation (DES) model was created to evaluate different hospital selection policies. The subsequent steps of the research consisted of developing a general simulation architecture, implementing it in the AnyLogic software, and validating it on a realistic dataset. The hospital selection policy that produced the best results was the minimization of the Time To Provider (TTP), defined as the time from the start of the ambulance journey to the ED until the start of the clinical evaluation by the physician. Finally, two approaches were further compared: a static approach, which is based on a retrospective estimate of the TTP, and a dynamic approach, which is based on a predictive estimate of the TTP determined with a constantly updated Winters model. Findings reveal that adopting the minimization of TTP as a hospital selection policy yields several benefits: service throughput times in the ED are significantly reduced with only a minimal increase in travel time. Furthermore, an immediate view of the saturation state of the ED is produced, and the case-mix present in the ED structures (i.e., the different triage codes) is taken into account, as different severity codes correspond to different service throughput times. In addition, the predictive approach is more reliable than the retrospective one in terms of TTP estimation, but it is more difficult to apply. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
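
To make the TTP-minimizing policy concrete, the sketch below selects, for each request, the ED that minimizes predicted travel time plus predicted waiting time, with the waiting time forecast by a Holt-Winters (triple exponential smoothing) model in the spirit of the dynamic approach. The data structures, seasonal period and the split of TTP into travel and waiting components are illustrative assumptions, not the authors' AnyLogic implementation.

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def forecast_wait(history, seasonal_periods=24):
    """Forecast the next-hour waiting time for one ED from its hourly history (pandas Series)."""
    model = ExponentialSmoothing(history, trend="add", seasonal="add",
                                 seasonal_periods=seasonal_periods).fit()
    return float(model.forecast(1).iloc[0])

def select_hospital(travel_times, wait_histories):
    """Pick the ED minimizing predicted Time To Provider = travel time + predicted wait."""
    ttp = {ed: travel_times[ed] + forecast_wait(hist)
           for ed, hist in wait_histories.items()}
    return min(ttp, key=ttp.get)

# Example usage with hypothetical data:
# travel_times = {"ED_A": 12.0, "ED_B": 18.0}             # minutes
# wait_histories = {"ED_A": series_a, "ED_B": series_b}   # hourly pandas Series
# best = select_hospital(travel_times, wait_histories)
```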

Keywords: discrete event simulation, emergency medical services, forecast model, hospital selection

Procedia PDF Downloads 84
12941 Money Laundering Risk Assessment in the Banking Institutions: An Experimental Approach

Authors: Yusarina Mat-Isa, Zuraidah Mohd-Sanusi, Mohd-Nizal Haniff, Paul A. Barnes

Abstract:

Given that money laundering has become a prominent concern for banking institutions, they are obliged to adopt a risk-based approach as an integral component of accepted anti-money laundering policies. In doing so, those involved in banking operations are the most critical group of personnel, as they deal with the day-to-day operations of the banking institutions and are obligated to form a judgement on the level of impending risk. This requirement extends to all relevant banking institution staff, such as tellers and customer account representatives, who must identify suspicious customers and escalate cases to the relevant authorities. Banking institution staff, however, face enormous challenges in distinguishing money launderers from legitimate customers seeking genuine banking transactions. They are mostly educated and trained with the business objective of serving customers in mind and are not trained to be “detectives with a detective’s power of observation”. Despite increasing awareness and the training provided to banking institution staff, their competency in assessing money laundering risk is still insufficient. Several gaps have prompted this study, including the lack of behavioural perspectives in the assessment of money laundering risk in banking institutions. Using an experimental approach, respondents are randomly assigned, within a controlled setting, to manipulated situations, and their judgement is solicited based on various observations related to those situations. The study suggests that it is imperative that informed judgement is exercised in deciding whether to proceed with the banking services required by the customers. Judgement forms the basis on which banking institution staff decide whether customers pose a money laundering risk. Failure to exercise good judgement could result in losses and the absorption of unnecessary risk into the banking institution. Although banking institutions have access to a range of automated solutions for assessing money laundering risk, the human factor in assessing the risk is indispensable. Individual staff in banking institutions are the first line of defence, responsible for screening the impending risk of any customer soliciting banking services. Ultimately, automated solutions are no substitute for individual involvement in money laundering risk assessment, as human judgement is inimitable.

Keywords: banking institutions, experimental approach, money laundering, risk assessment

Procedia PDF Downloads 259
12940 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model

Authors: Didier Auroux, Vladimir Groza

Abstract:

This work is part of the STEEP Marie-Curie ITN project, and it focuses on the identification of unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The need to study this problem arises from industrial milling applications, where predicting and modelling the final surface with high accuracy is one of the primary tasks even though the model parameters that should be used are not known. In this framework, we propose the identification of the model parameters by minimizing a cost function that measures the difference between experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, makes it possible to determine the unknowns of the AWJM model and the optimal values that could be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strongly depends on the input measurements. Regularization terms can be used effectively to deal with the presence of data noise and to improve the correctness of the identification. Based on this approach, we present 2D and 3D results for the identification of the model parameters and for surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain acceptable results for manufacturing and to expect the proper identification of the unknowns. This approach also allows us to extend the research to more complex cases and to consider different types of model and measurement errors, as well as a 3D time-dependent model with variations of the jet feed speed.
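
The core idea of identifying parameters by minimizing a regularized misfit between measured and simulated profiles can be sketched as follows. The forward model, data and regularization weight here are placeholders; the actual AWJM PDE model, the adjoint-based gradients obtained with TAPENADE, and the real trench measurements are beyond this illustration.

```python
import numpy as np
from scipy.optimize import minimize

def forward_model(params, x):
    """Placeholder surrogate for the AWJM forward model: a Gaussian-shaped trench.
    params = (depth, width); the real model is a PDE solved numerically."""
    depth, width = params
    return -depth * np.exp(-(x / width) ** 2)

def cost(params, x, z_measured, prior, alpha=1e-2):
    """Data misfit plus a Tikhonov regularization term around a prior parameter guess."""
    misfit = np.sum((forward_model(params, x) - z_measured) ** 2)
    regularization = alpha * np.sum((np.asarray(params) - prior) ** 2)
    return misfit + regularization

# Hypothetical noisy "measurements" of a trench profile
x = np.linspace(-1.0, 1.0, 200)
true_params = (0.5, 0.3)
z_measured = forward_model(true_params, x) + 0.01 * np.random.randn(x.size)

prior = np.array([0.4, 0.4])          # prior guess used by the regularization term
result = minimize(cost, x0=prior, args=(x, z_measured, prior))
print("identified parameters:", result.x)
```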

Keywords: Abrasive Waterjet Milling, inverse problem, model parameters identification, regularization

Procedia PDF Downloads 311
12939 Disaggregating and Forecasting the Total Energy Consumption of a Building: A Case Study of a High Cooling Demand Facility

Authors: Juliana Barcelos Cordeiro, Khashayar Mahani, Farbod Farzan, Mohsen A. Jafari

Abstract:

Energy disaggregation has attracted the attention of many energy companies, since energy efficiency can be improved when the breakdown of energy consumption is known. Companies have been investing in software and/or hardware solutions that can provide this type of information to the consumer. On the other hand, not everyone can afford these technologies. Therefore, in this paper, we present a methodology for breaking down the aggregate consumption and identifying the profiles of high-demanding end uses. These energy profiles are then used to build a forecast model for optimal control purposes. A facility with a high cooling load is used as an illustrative case study to demonstrate the results of the proposed methodology. We apply a high-level energy disaggregation through a pattern recognition approach in order to extract the consumption profile of its rooftop packaged units (RTUs) and present a forecast model for the energy consumption.
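
One simple way to implement the pattern-recognition step described above is to cluster daily load profiles and treat the cluster structure as a signature of the dominant end use (here, the RTUs). The sketch below applies k-means to normalized hourly daily profiles; the data layout, number of clusters and interpretation of the clusters are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_daily_profiles(hourly_load, n_clusters=3):
    """Cluster daily load shapes from an hourly consumption series.

    hourly_load : 1-D array of hourly total building consumption (kWh),
                  length a multiple of 24.
    Returns the fitted KMeans model and the per-day cluster labels.
    """
    days = hourly_load.reshape(-1, 24)                 # one row per day
    shapes = days / days.sum(axis=1, keepdims=True)    # normalize out daily volume
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(shapes)
    return km, km.labels_

# km.cluster_centers_ can then be inspected to pick out the cooling-dominated
# (RTU-driven) shape, e.g. the cluster with the largest afternoon peak.
```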

Keywords: energy consumption forecasting, energy efficiency, load disaggregation, pattern recognition approach

Procedia PDF Downloads 270
12938 Using Heat-Mask in the Thermoforming Machine for Component Positioning in Thermoformed Electronics

Authors: Behnam Madadnia

Abstract:

In recent years, 3D-shaped electronics has been on the rise, with many uses in home appliances, automotive applications, and manufacturing. One of the biggest challenges in the fabrication of 3D-shaped electronics made by thermoforming is repeatable and accurate component positioning; typically, there is no control over the final position of the component. This paper aims to address this issue and present a reliable approach for guiding the electronic components to the desired place during thermoforming. We propose a heat-control mask in the thermoforming machine that controls the heating of the polymer so that specific parts do not become formable, which can ensure the mechanical stability of the conductive traces during thermoforming of the substrate. We have verified the accuracy of our approach by applying the method to a real industrial semi-sphere mold for positioning 7 LEDs and one touch sensor. We measured the LEDs' positions after thermoforming to demonstrate the repeatability of the process. The experimental results show that the proposed method is capable of positioning electronic components in thermoformed 3D electronics with high precision.

Keywords: 3D-shaped electronics, electronic components, thermoforming, component positioning

Procedia PDF Downloads 87
12937 Flocculation on the Treatment of Olive Oil Mill Wastewater: Pre-Treatment

Authors: G. Hodaifa, J. A. Páez, C. Agabo, E. Ramos, J. C. Gutiérrez, A. Rosal

Abstract:

Currently, the continuous two-phase decanter process used for olive oil production is the most widespread internationally. The wastewater generated by this industry (OMW) is a real environmental problem because of its high organic load. Among the treatments proposed for this wastewater, advanced oxidation technologies (Fenton process, ozone, photo-Fenton, etc.) are the most favourable. The direct application of these processes is, however, somewhat expensive. Therefore, the application of a prior stage based on a flocculation-sedimentation operation is of high importance. In this research, five commercial flocculants (three cationic and two anionic) were used to achieve the separation of phases (clarified liquid and sludge). For each flocculant, different concentrations (0-1000 mg/L) were studied. In these experiments, the sludge volume formed over time and the final water quality were determined. The final removal percentages of total phenols (11.3-25.1%), COD (5.6-20.4%), total carbon (2.3-26.5%), total organic carbon (1.50-23.8%), total nitrogen (1.45-24.8%), and turbidity (27.9-61.4%) were obtained, and the variation of the electric conductivity reduction percentage (1-8%) was determined. Finally, the flocculants with the highest removal percentages were identified (QG2001 and Flocudex CS49).
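
For reference, the removal percentages reported above are conventionally computed from the inlet and outlet concentrations of each parameter; the formula below is the standard definition and is stated here only as an illustration of how such values are obtained.

$$ R\,(\%) = \frac{C_{\mathrm{in}} - C_{\mathrm{out}}}{C_{\mathrm{in}}} \times 100 $$

where $C_{\mathrm{in}}$ and $C_{\mathrm{out}}$ are the concentrations of the parameter of interest (e.g., COD or total phenols) before and after the flocculation-sedimentation stage.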

Keywords: flocculants, flocculation, olive oil mill wastewater, water quality

Procedia PDF Downloads 537
12936 Diversity and Ecology of the Aquatic Avifauna of the Wetland of Sebkhet Bazer Sakhra, South of Setif, Algeria

Authors: Gouga Hadjer, Djerdali Sofia, Benssaci Ettayeb

Abstract:

In order to estimate the evolution of aquatic avifauna numbers and their seasonal variation in Sebkhet Bazer-Sakhra (a site of the Setif wetland eco-complex), monitoring carried out from September 2012 to August 2013 allowed 54 species to be inventoried, spread over 8 orders, 15 families and 34 genera. To follow the overall dynamics and the seasonal distribution of the species inventoried at Sebkhet Bazer, the variation in total abundance was analysed using ecological indices. The autumn season hosts the largest number of birds, totalling 3639 individuals. Accidental species are well represented in the autumn and spring seasons, underlining the interest of the site with respect to the migration passages of aquatic birds. During autumn and spring, the Flamingo and the Common Shelduck are the most abundant, with (500, 883) and (560, 1296) individuals respectively. The ecological analysis of this community showed that the highest species richness is recorded in spring (45 species), while the lowest value, 20 species, is obtained in summer.

Keywords: Sebkhet of Bazer Sakhra, ecology, aquatic avifauna, biodiversity, seasonal evolution, wetland

Procedia PDF Downloads 317
12935 A Hybrid Pareto-Based Swarm Optimization Algorithm for the Multi-Objective Flexible Job Shop Scheduling Problems

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a new hybrid particle swarm optimization algorithm is proposed for the multi-objective flexible job shop scheduling problem, a very important and hard combinatorial problem. The Pareto approach is used to solve the multi-objective problem. Several new local search heuristics based on the critical block concept are integrated into the algorithm to enhance its performance. The algorithm is compared with recently published multi-objective algorithms on benchmarks selected from the literature. Several metrics are used to quantify performance and compare the solutions achieved. The algorithms are also compared using the weighted summation of objectives approach. The proposed algorithm finds Pareto solutions more efficiently than the compared algorithms and in less computational time.
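
At the heart of the Pareto approach is the dominance test used to maintain the archive of non-dominated schedules. The sketch below shows this test and a simple non-dominated filter for minimization objectives (e.g., makespan and total workload); it illustrates the concept only and is not the authors' hybrid particle swarm algorithm.

```python
def dominates(a, b):
    """True if solution a dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Filter a list of objective vectors down to the non-dominated set."""
    front = []
    for s in solutions:
        if not any(dominates(other, s) for other in solutions if other is not s):
            front.append(s)
    return front

# Example: (makespan, total workload) pairs for candidate schedules
candidates = [(120, 300), (115, 320), (130, 280), (118, 290)]
print(pareto_front(candidates))   # -> [(115, 320), (130, 280), (118, 290)]
```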

Keywords: swarm-based optimization, local search, Pareto optimality, flexible job shop scheduling, multi-objective optimization

Procedia PDF Downloads 363
12934 A Formal Property Verification for Aspect-Oriented Programs in Software Development

Authors: Moustapha Bande, Hakima Ould-Slimane, Hanifa Boucheneb

Abstract:

Software development for complex systems requires efficient and automatic tools that can be used to verify the satisfiability of critical properties such as security properties. With the emergence of Aspect-Oriented Programming (AOP), considerable work has been done in order to better modularize the separation of concerns in software design and implementation. The goal is to prevent cross-cutting concerns from being scattered across the multiple modules of the program and tangled with other modules. One of the key challenges in aspect-oriented programs is to ensure that all the pieces put together at weaving time satisfy the overall system requirements. Our paper focuses on this problem and proposes a formal property verification approach for a given property from the woven program. The approach is based on the control flow graph (CFG) of the woven program and the use of a satisfiability modulo theories (SMT) solver to check whether each property (represented by one aspect) is satisfied or not once the weaving is done.
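
To make the SMT step concrete, the sketch below encodes a tiny path condition drawn from a hypothetical woven CFG and asks the Z3 solver whether the property can be violated along that path (the property holds if the negated check is unsatisfiable). The facts and the property are invented for illustration; the paper's actual CFG extraction and encoding are more involved.

```python
from z3 import Solver, Bools, Implies, And, Not, unsat

# Hypothetical path facts from the woven program's CFG:
# if the security aspect's advice ran before the resource access,
# then the access is authenticated.
advice_ran, access, authenticated = Bools("advice_ran access authenticated")

path_conditions = And(
    advice_ran,                              # the woven advice executes on this path
    Implies(advice_ran, authenticated),      # the advice establishes authentication
    access,                                  # the join point (resource access) is reached
)

# Property to verify: every access on this path is authenticated.
prop = Implies(access, authenticated)

s = Solver()
s.add(path_conditions, Not(prop))            # search for a counterexample
if s.check() == unsat:
    print("property holds on this path")
else:
    print("counterexample:", s.model())
```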

Keywords: aspect-oriented programming, control flow graph, property verification, satisfiability modulo theories

Procedia PDF Downloads 170
12933 The Differentiation of Performances among Immigrant Entrepreneurs: A Biographical Approach

Authors: Daniela Gnarini

Abstract:

This paper aims to contribute to the field of immigrants' entrepreneurial performance. The debate on immigrant entrepreneurship has been dominated by cultural explanations, which argue that immigrants' entrepreneurial results are linked to group characteristics. However, this approach does not consider important dimensions that influence entrepreneurial performance. Furthermore, cultural theories do not take into account the large differences in performance within the same ethnic group. For these reasons, this study adopts a biographical approach, at both the theoretical and the methodological level, to understand the main aspects that make the difference in immigrants' entrepreneurial performance by exploring the narratives of immigrant entrepreneurs operating in the restaurant sector in two Italian metropolitan areas: Milan and Rome. Through the qualitative method of biographical interviews, this study analyses four main dimensions and their combinations: a) individuals' entrepreneurial and migratory paths, which are particularly relevant for understanding the biographical resources of immigrant entrepreneurs and how they change and evolve over time; b) entrepreneurs' social capital, with a particular focus on their networks, adopting a transnational perspective that takes into account both the local level and transnational connections; the study highlights that, although entrepreneurs' connections are significant, especially those with family members, their entrepreneurial paths often follow an individualised trajectory; c) entrepreneurs' human capital, including both formal education and skills acquired through informal channels, the latter being particularly relevant since the role of informal transmission emerges from the interviews and data collected; and d) embeddedness within the social, political and economic context, to understand the main constraints and opportunities at both local and national level; the comparison between two metropolitan areas within the same country helps to understand this dimension.

Keywords: biographies, immigrant entrepreneurs, life stories, performance

Procedia PDF Downloads 220
12932 Using a Quantitative Reasoning Framework to Help Students Understand Arc Measure Relationships

Authors: David Glassmeyer

Abstract:

Quantitative reasoning is necessary to robustly understand mathematical concepts ranging from elementary to university levels. Quantitative reasoning involves identifying and representing quantities and the relationships between these quantities. Without reasoning quantitatively, students often resort to memorizing formulas and procedures, which has negative impacts when they encounter mathematical topics in the future. This study investigated how high school students’ quantitative reasoning could be fostered within a unit on arc measure and angle relationships. Arc measure, or the measure of a central angle that cuts off a portion of a circle’s circumference, is often confused with arc length. In this study, the researcher redesigned an activity to clearly distinguish arc measure and arc length by using a quantitative reasoning framework. Data were collected from high school students to determine if this approach impacted their understanding of these concepts. Initial data indicate the approach was successful in supporting students’ quantitative reasoning of these topics. Implications of the work are that teachers themselves may also benefit from considering mathematical definitions from a quantitative reasoning framework and can use this activity in their own classrooms.
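
For readers unfamiliar with the distinction, the relationship between the two quantities can be stated directly: arc measure is an angle (in degrees or radians), while arc length also depends on the circle's radius. The standard relation, given here as background rather than as part of the study's materials, is:

$$ s = r\,\theta_{\mathrm{rad}} = \frac{\theta_{\mathrm{deg}}}{360^{\circ}} \cdot 2\pi r $$

so two arcs can share the same measure $\theta$ yet have different lengths $s$ whenever their radii differ.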

Keywords: arc length, arc measure, quantitative reasoning, student content knowledge

Procedia PDF Downloads 249
12931 Topology Optimization of Heat and Mass Transfer for Two Fluids under Steady State Laminar Regime: Application on Heat Exchangers

Authors: Rony Tawk, Boutros Ghannam, Maroun Nemer

Abstract:

The topology optimization technique is a potential tool for the design and optimization of structures involved in mass and heat transfer. The method starts with an initial intermediate domain and should be able to progressively distribute the solid and the two fluids exchanging heat. The multi-objective function of the problem takes into account the minimization of total pressure loss and the maximization of heat transfer between solid and fluid subdomains. Existing methods account for the presence of only one fluid, while the present work extends the optimization to the distribution of a solid and two different fluids. This requires separating the channels of the two fluids and ensuring a minimum solid thickness between them. This is done by adding a third objective function to the multi-objective optimization problem. This article uses a density approach in which each cell holds two local design parameters ranging from 0 to 1, and the combination of their extreme values defines the presence of solid, cold fluid or hot fluid in that cell. The finite volume method is used for the direct solver, coupled with a discrete adjoint approach for sensitivity analysis and the method of moving asymptotes for numerical optimization. Several examples are presented to show the ability of the method to find a trade-off between minimization of power dissipation and maximization of heat transfer while ensuring the separation and continuity of the channel of each fluid without crossing or mixing the fluids. The main conclusion is that an optimal bi-fluid domain defining a fluid-to-fluid heat exchanger device can be found using topology optimization.
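
The sketch below illustrates the two-design-variable density idea described above: each cell carries a pair (γ1, γ2) in [0, 1]², and the corners of that square are interpreted as solid, hot fluid or cold fluid. The specific corner assignment and the hard threshold used here are illustrative assumptions, not the authors' exact parameterization or interpolation scheme.

```python
import numpy as np

def interpret_cell(gamma1, gamma2, threshold=0.5):
    """Map a cell's two design variables to a material phase.

    Corner convention (assumed for illustration):
      gamma1 ~ 0 -> solid; gamma1 ~ 1 -> fluid
      gamma2 ~ 0 -> cold fluid; gamma2 ~ 1 -> hot fluid (only meaningful if fluid)
    """
    if gamma1 < threshold:
        return "solid"
    return "hot fluid" if gamma2 >= threshold else "cold fluid"

def phase_map(gamma1_field, gamma2_field):
    """Interpret a whole design field (two 2-D arrays of equal shape)."""
    interpret = np.vectorize(interpret_cell)
    return interpret(gamma1_field, gamma2_field)

# Example: a 2x2 design field
g1 = np.array([[0.1, 0.9], [0.8, 0.95]])
g2 = np.array([[0.5, 0.1], [0.9, 0.2]])
print(phase_map(g1, g2))
```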

Keywords: topology optimization, density approach, bi-fluid domain, laminar steady state regime, fluid-to-fluid heat exchanger

Procedia PDF Downloads 393
12930 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow

Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat

Abstract:

Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature to the classification of engagement and distraction was shown to be eye gaze. It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability, human observation or reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
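
As an illustration of the classification step reported above, the sketch below evaluates a random forest with leave-one-out cross-validation on a feature matrix of the kind described (eye gaze, EEG, body pose and interaction features, with engagement labels derived from the continuous performance test). The file names, feature layout and hyperparameters are placeholders; this is not the study's actual pipeline or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Hypothetical feature matrix: one row per observation window, columns such as
# gaze dispersion, EEG band power, head pose variance, interaction rate, ...
X = np.load("features.npy")            # assumed file, shape (n_samples, n_features)
y = np.load("engagement_labels.npy")   # 1 = engaged, 0 = disengaged (from the CPT)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print("leave-one-out accuracy: %.3f" % scores.mean())

# Fitting on all data exposes per-feature importances, e.g. to check whether
# eye-gaze features carry the most weight, as reported in the study.
importances = clf.fit(X, y).feature_importances_
```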

Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement

Procedia PDF Downloads 83
12929 A Global Fuel Combustion Data Product and Its Application

Authors: Shu Tao, Rong Wang, Huizhong Shen, Ye Huang

Abstract:

High-resolution mapping of fuel combustion is essential for reducing uncertainties in assessments of greenhouse gas and air pollutant emissions. Such inventories provide valuable information for inferring carbon sinks, modeling pollutant transport, and developing control strategies. Previous inventories included only a few fuel types and were derived using national population proxies, which may distort the geographical variation within countries. In this study, a global 0.1 degree by 0.1 degree geo-referenced inventory of fuel combustion (PKU-FUEL-2007) was developed for 64 fuel sub-types for the year 2007, along with an uncertainty analysis. Sub-national fuel consumption data for large countries and the locations of major power stations were used. The disaggregation error can be reduced significantly by using the sub-national energy data, because the uneven distribution of per-capita fuel consumption within countries is taken into consideration. PKU-FUEL was used to generate global emission inventories of CO2 (PKU-CO2-2007), polycyclic aromatic hydrocarbons (PKU-PAHs-2007), and black carbon (PKU-BC-2007). Atmospheric transport modeling and exposure assessment were conducted for BC and PAHs based on the inventory.

Keywords: fuel, emission, BC, PAHs, atmospheric transport, exposure

Procedia PDF Downloads 323
12928 Scalable and Accurate Detection of Pathogens from Whole-Genome Shotgun Sequencing

Authors: Janos Juhasz, Sandor Pongor, Balazs Ligeti

Abstract:

Next-generation sequencing, especially whole genome shotgun sequencing, is becoming a common approach to gain insight into microbiomes in a culture-independent way, even in clinical practice. It not only gives us information about the species composition of an environmental sample but also opens the possibility of detecting antimicrobial resistance and novel, or currently unknown, pathogens. Accurately and reliably detecting the microbial strains is a challenging task. Here we present a sensitive approach for detecting pathogens in metagenomic samples, with special regard to detecting novel variants of known pathogens. We have developed a pipeline that uses fast short-read aligner programs (i.e., Bowtie2/BWA) and comprehensive nucleotide databases. Taxonomic binning is based on the lowest common ancestor (LCA) principle; each read is assigned to a taxon covering the most significantly hit taxa. This approach helps balance sensitivity and running time. The program was tested on both experimental and synthetic data. The results indicate that our method performs as well as state-of-the-art BLAST-based ones; furthermore, in some cases it even proves to be better, while running two orders of magnitude faster. It is sensitive and capable of identifying taxa present only in small abundance. Moreover, it needs two orders of magnitude fewer reads to complete the identification than MetaPhLan2 does. We analyzed an experimental anthrax dataset (B. anthracis strain BA104). The majority of the reads (96.50%) were classified as Bacillus anthracis, and a small portion, 1.2%, was classified as other species from the Bacillus genus. We demonstrate that the evaluation of high-throughput sequencing data is feasible in a reasonable time with good classification accuracy.
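
The lowest common ancestor step can be illustrated in a few lines of code: given the taxonomic lineages of all reference taxa that a read hits significantly, the read is assigned to the deepest taxon shared by all of them. The lineage representation below is a simplification for illustration, not the pipeline's actual data structures.

```python
def lowest_common_ancestor(lineages):
    """Assign a read to the deepest taxon shared by all hit lineages.

    lineages : list of lineages, each ordered from root to leaf, e.g.
               ["Bacteria", "Firmicutes", "Bacilli", "Bacillales",
                "Bacillaceae", "Bacillus", "Bacillus anthracis"]
    """
    lca = []
    for ranks in zip(*lineages):          # walk the ranks in parallel
        if all(r == ranks[0] for r in ranks):
            lca.append(ranks[0])          # still shared by every hit
        else:
            break                         # lineages diverge here
    return lca[-1] if lca else "unclassified"

hits = [
    ["Bacteria", "Firmicutes", "Bacilli", "Bacillales", "Bacillaceae", "Bacillus", "Bacillus anthracis"],
    ["Bacteria", "Firmicutes", "Bacilli", "Bacillales", "Bacillaceae", "Bacillus", "Bacillus cereus"],
]
print(lowest_common_ancestor(hits))   # -> "Bacillus"
```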

Keywords: metagenomics, taxonomy binning, pathogens, microbiome, B. anthracis

Procedia PDF Downloads 130
12927 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data

Authors: M. Mueller, M. Kuehn, M. Voelker

Abstract:

In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization and planning due to a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology whose objective is an SME-appropriate approach for efficient, temporarily feasible data collection and evaluation in flexible production and logistics systems as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth low energy (BLE)-based transmitters, so-called beacons, and smart mobile devices (SMD), e.g. smartphones, as receivers, between which distance data can be measured and from which motion profiles can be derived. The distance is determined using the Received Signal Strength Indicator (RSSI), which is a measure of the signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for interpreting the relative movements of transmitters and receivers based on distance data. The main research concerns the selection and implementation of pattern recognition methods for automatic process recognition, as well as methods for visualizing relative distance data. Since the database is already categorized by process type, classification methods from the field of supervised learning (e.g. Support Vector Machines) are used. The required data quality calls for the selection of suitable methods and filters for smoothing the signal variations of the RSSI, the integration of methods for determining correction factors that depend on possible sources of signal interference (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, correction models and visualization methods used. The accuracy of classification algorithms can be improved by up to 30% through selected parameter variation, as has already been demonstrated in previous studies. Similar potential can be observed with parameter variation of the methods and filters for signal smoothing. Thus, there is increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including the methods for signal smoothing, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with a database connection: the automated assignment of defined process classes to distance data using selected classification algorithms, and visualization and reporting via a graphical user interface (GUI).
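
The sketch below shows the two data-preparation steps described above in their simplest textbook form: smoothing the raw RSSI stream and converting it to a distance estimate via a log-distance path-loss model. The path-loss exponent, reference power and smoothing factor are assumed values for illustration; the actual filters, correction factors and parameter settings are precisely what the study investigates.

```python
import numpy as np

def smooth_rssi(rssi, alpha=0.2):
    """Exponential smoothing of a raw RSSI stream (dBm values)."""
    smoothed = [rssi[0]]
    for value in rssi[1:]:
        smoothed.append(alpha * value + (1 - alpha) * smoothed[-1])
    return np.array(smoothed)

def rssi_to_distance(rssi_dbm, tx_power=-59.0, path_loss_exponent=2.0):
    """Log-distance path-loss model: distance in metres from RSSI.

    tx_power           : RSSI measured at 1 m (assumed calibration value)
    path_loss_exponent : environment-dependent exponent (assumed ~2, free space)
    """
    return 10 ** ((tx_power - rssi_dbm) / (10 * path_loss_exponent))

raw = np.array([-62, -70, -65, -68, -66], dtype=float)   # hypothetical beacon readings
distances = rssi_to_distance(smooth_rssi(raw))
print(np.round(distances, 2))
```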

Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing

Procedia PDF Downloads 123
12926 Natural Frequency Analysis of a Porous Functionally Graded Shaft System

Authors:

Abstract:

The vibration characteristics of a functionally graded (FG) rotor model having porosities and micro-voids are investigated using three-dimensional finite element analysis. The FG shaft is mounted with a steel disc located at the midspan. The shaft ends are supported on isotropic bearings. The FG material is composed of a metallic phase (stainless steel) and a ceramic phase (zirconium oxide) as its constituent phases. The layer-wise material property variation is governed by a power law. Material property equations are developed for the porosity modelling. Python code is developed to assign the material properties to each layer, including the effect of porosities. The ANSYS commercial software is used to extract the natural frequencies and whirl frequencies for the FG shaft system. The obtained results show the influence of porosity volume fraction and power-law index on the vibration characteristics of the ceramic-based FG shaft system.
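
A minimal sketch of the layer-wise property assignment described above is given below, using a power-law mixture of the ceramic and metal properties with a simple even-porosity correction term. The porosity model, layer positions and numerical values are assumptions for illustration; the study's own property equations may differ.

```python
def effective_property(p_ceramic, p_metal, layer_position, k, porosity=0.0):
    """Power-law graded property at a normalized layer position in [0, 1].

    layer_position = 0 -> metal-rich surface, 1 -> ceramic-rich surface.
    The last term is a commonly used even-porosity correction (assumed model).
    """
    volume_fraction_ceramic = layer_position ** k
    graded = p_metal + (p_ceramic - p_metal) * volume_fraction_ceramic
    return graded - (porosity / 2.0) * (p_ceramic + p_metal)

# Example: approximate Young's moduli (GPa) of zirconia (~200) and stainless steel (~208)
layers = [i / 9 for i in range(10)]                      # ten layers through the thickness
E_layers = [effective_property(200.0, 208.0, z, k=2.0, porosity=0.05) for z in layers]
```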

Keywords: Finite element method, Functionally graded material, Porosity volume fraction, Power law

Procedia PDF Downloads 195
12925 Numerical and Comparative Analysis between Two Composite Plates Notched in Different Shapes and Repaired by Composite

Authors: Amari Khaoula, Berrahou Mohamed

Abstract:

This article presents a numerical and comparative analysis of two notched boron/epoxy plates, one U-shaped and the other V-shaped, both cracked and repaired by a rectangular patch of the same composite material. The finite element method was used for the numerical study and for comparing the results obtained, with the aim of determining the optimal notch shape that gives the repair a longer life. In this context, we studied the variation of the stress intensity factor, the evolution of the damaged area, the ratio of the damaged area as a function of crack length, and the concentration of the von Mises stresses as a function of path length. According to the results obtained, we conclude that the U-notched plate is the optimal one compared to the V-notched plate, because it shows lower values of the stress intensity factor (SIF), the damaged area ratio (Dᵣ), and the von Mises stresses.

Keywords: the notch U, the notch V, the finite element method FEM, comparison, rectangular patch, composite, stress intensity factor, damaged area ratio, Von Mises stresses

Procedia PDF Downloads 93
12924 Numerical Simulation of Multiple Arrays Arrangement of Micro Hydro Power Turbines

Authors: M. A. At-Tasneem, N. T. Rao, T. M. Y. S. Tuan Ya, M. S. Idris, M. Ammar

Abstract:

River flow over micro hydro power (MHP) turbines arranged in multiple arrays is simulated with computational fluid dynamics (CFD) software to obtain the flow characteristics. In this paper, CFD software is used to simulate the water flow over MHP turbines placed in a river. Arranging MHP turbines in multiple arrays makes it possible to generate a large amount of power. In this study, a river model is created and simulated in CFD software to obtain the water flow characteristics. The process then continues by simulating different types of array arrangements in the river model. An MHP turbine model consists of a turbine outer body with a static propeller blade inside it. Five types of arrangements are used, namely parallel, series, triangular, square and rhombus, with different spacing sizes. The velocity profiles of each MHP turbine are identified at the mouth of each turbine body. This study aims to identify the arrangement and spacing size that can produce the highest power density from the variation of the water flow.
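
As background to the power-density comparison, the power a hydrokinetic turbine can extract scales with the cube of the local flow velocity, which is why the velocity profile at each turbine mouth is the key output of the simulations. The standard relation, included here only as a reference, is:

$$ P = \frac{1}{2}\, C_p\, \rho\, A\, v^{3} $$

where $\rho$ is the water density, $A$ the turbine's swept area, $v$ the incident flow velocity and $C_p$ the power coefficient of the turbine.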

Keywords: micro hydro power, CFD, arrays arrangement, spacing sizes, velocity profile, power

Procedia PDF Downloads 355
12923 The Approach of Male and Female Spectators about the Presence of Female Spectators in Sport Stadiums of Iran

Authors: Mohammad Reza Boroumand Devlagh, Seyed Mohammad Hosein Razavi, Fatemeh Ahmadi, Azam Fazli Darzi

Abstract:

The issue of female presence in Iranian stadiums has long been considered and debated by governmental experts and authorities; however, no conclusion has yet been reached. Thus, the present study was carried out with the aim of investigating the attitudes of male and female spectators toward the presence of female spectators in Iranian stadiums. The statistical population of the study includes all male and female spectators who have not experienced watching men's championship matches live in stadiums. A sample of 224 subjects was selected from this population through stratified random sampling. For data collection, a researcher-made questionnaire was used, whose validity was confirmed by university professors and whose reliability was examined and confirmed through a preliminary study (r = 0.81). Data analysis was conducted using descriptive and inferential statistics at P < 0.05. The results showed that both male and female spectators significantly agreed with female presence in stadiums and that there is no significant difference between male and female attitudes concerning female spectators' presence in the sport stadiums of Iran (sig = 0.867).

Keywords: male, female spectators, Iran, sport stadiums, population

Procedia PDF Downloads 545
12922 Experimental Study of Local Scour Downstream of Cylindrical Bridge Piers

Authors: Mohammed Traeq Shukri

Abstract:

Scour is a natural phenomenon caused by the erosive action of a flowing stream on alluvial beds, which removes the sediment around or near structures located in flowing water. It refers to the lowering of the riverbed level by water erosion, such that the foundations of a structure tend to become exposed. It is the result of the erosive action of flowing water, which excavates and carries away material from the bed and banks of streams and from around the piers of bridges. The failure of bridges due to excessive local scour during floods poses a challenging problem to hydraulic engineers. The failure of bridge piers is due to many causes, such as localized scour combined with general riverbed degradation. In this paper, we estimate the temporal variation of scour depth at a non-uniform cylindrical bridge pier through experimental work in the civil engineering hydraulics laboratories of Gaziantep University, in a channel with dimensions of 8.3 m length, 0.8 m width and 0.9 m depth. The experiments will be carried out on a 20 cm deep sediment layer with d50 = 0.4 mm. Three bridge pier shapes with differently scaled models will be constructed in a 1.5 m test section of the channel.

Keywords: scour, local scour, bridge piers, scour depth, vortex, horseshoe vortex

Procedia PDF Downloads 160
12921 Construction of a Wind Tunnel for Aerodynamic Tests

Authors: Elmo Thiago Lins Cöuras Ford, Valentina Alessandra Carvalho do Vale, José Ubiragi de Lima Mendes

Abstract:

The study of aerodynamics is related to improving the performance of airplanes and automobiles, with the objective of reducing the effect of air friction on structures, providing higher speeds and lower fuel consumption. The application of aerodynamic knowledge is no longer limited to the aeronautical and automotive industries. Given the new demands for aerodynamic study in many areas of engineering, this work presents the stages of the design and construction of a wind tunnel for use in aerodynamic tests. Among the several existing wind tunnel configurations, an open-circuit tunnel was chosen because of its lower construction and installation complexity, operational simplicity and reduced cost. It is of the blower type, to take advantage of higher motor efficiency, and includes a diffusion section so that the air flow gains speed before reaching the test section. The design guidelines were didactic practices: study of the boundary layer and analysis of the flow over test bodies with different geometries. The pressure variation in the test section was measured with a Pitot tube connected to a manometer. Quantitative and qualitative results proved to be satisfactory.
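
For reference, the velocity in the test section follows from the Pitot-static pressure difference read on the manometer through Bernoulli's relation; the formula below is the standard one and is given here only as an illustration of how the measurement is used.

$$ v = \sqrt{\frac{2\,\Delta p}{\rho}} $$

where $\Delta p$ is the difference between the stagnation and static pressures measured by the Pitot tube and $\rho$ is the air density.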

Keywords: wind tunnel, aerodynamics, air, airplane

Procedia PDF Downloads 481
12920 Enhancing Rural Agricultural Value Chains through Electric Mobility Services in Ethiopia

Authors: Clemens Pizzinini, Philipp Rosner, David Ziegler, Markus Lienkamp

Abstract:

Transportation is a constitutive part of most supply and value chains in modern economies. Smallholder farmers in rural Ethiopia face severe challenges along their supply and value chains. In particular, suitable, affordable, and available transport services are in high demand. To develop context-specific technical solutions, a problem-to-solution methodology based on interaction with technology is developed. With this approach, we fill the gap between proven transportation assessment frameworks and general user-centered techniques. Central to our approach is an electric test vehicle that is implemented in rural supply and value chains for research, development, and testing. Based on our objective and the derived methodological requirements, a set of existing methods is selected. Local partners are integrated into an organizational framework that executes major parts of this research endeavour in the Arsi Zone, Oromia Region, Ethiopia.

Keywords: agricultural value chain, participatory methods, agile methods, sub-Saharan Africa, Ethiopia, electric vehicle, transport service

Procedia PDF Downloads 64
12919 Syntactic Ambiguity and Syntactic Analysis: Transformational Grammar Approach

Authors: Olufemi Olupe

Abstract:

Within linguistics, various approaches have been adopted to the study of language. One such approach is syntax. Syntax is the aspect of the grammar of a language that deals with how words are put together to form phrases and sentences and how such structures are interpreted. Ambiguity, which is also germane to this discourse, concerns the uncertainty of meaning that arises when a phrase or sentence can be understood and interpreted in more than one way. In the light of the above, this paper attempts a syntactic study of syntactic ambiguities in the English language, using the Transformational Generative Grammar (TGG) approach. In doing this, phrases and sentences were presented, with each description followed by the relevant analysis. The findings reveal that ambiguity cannot always be disambiguated by means of syntactic analysis alone, without recourse to semantic interpretation. A further finding shows that some syntactically ambiguous structures cannot be analysed into two surface structures, even though they have more than one deep structure. The paper concludes that, inasmuch as ambiguity remains in language, it will continue to pose a problem of understanding to second language learners. Users of English as a second language must, however, make a conscious effort to avoid ambiguity in order to achieve effective communication.

Keywords: language, syntax, semantics, morphology, ambiguity

Procedia PDF Downloads 385
12918 British English vs. American English: A Comparative Study

Authors: Halima Benazzouz

Abstract:

It is often believed that British English and American English are the foremost varieties of the English language, serving as reference norms for other varieties; that is the reason why they have so often been compared and contrasted. Meanwhile, the terms “British English” and “American English” are used differently by different people to refer to: 1) two national varieties, each subsuming regional and other sub-varieties, standard and non-standard; 2) two national standard varieties, in which each is only part of the range of English within its own state, but the most prestigious part; 3) two international varieties, that is, each is more than a national variety of the English language; 4) two international standard varieties that may or may not each subsume other standard varieties. Furthermore, each variety serves as a reference norm for users of the language elsewhere. Moreover, without a clear identification as primarily belonging to one variety or the other, British English (Br.Eng) and American English (Am.Eng) are understood as national or international varieties. British English and American English are both “variants” and “varieties” of the English language, more similar than different. In brief, the following may justify general categories of difference between Standard American English (S.Am.E) and Standard British English (S.Br.E), each having its own sociolectal value: a difference in pronunciation may exist between the two foremost varieties although the spelling is the same; by contrast, a divergence in spelling may be recognized even though the pronunciation is the same. In other cases, the same term differs while spelling and pronunciation are similar. Otherwise, grammar, syntax, and punctuation are distinctively used to distinguish the two varieties of the English language. Beyond these differences, spelling is noted as one of the chief sources of variation.

Keywords: Greek, Latin, French pronunciation expert, varieties of English language

Procedia PDF Downloads 493
12917 Political Economy and Human Rights Engaging in Conversation

Authors: Manuel Branco

Abstract:

This paper argues that mainstream economics is one of the reasons that can explain the difficulty in fully realizing human rights, because its logic is intrinsically contradictory to human rights, most especially economic, social and cultural rights. First, its utilitarianism, in both its cardinal and ordinal understanding, contradicts human rights principles. Maximizing aggregate utility along the lines of cardinal utility is a theoretical exercise that consists in ensuring as much as possible that gains outweigh losses in society. In this process an individual may get worse off, though. While mainstream logic is comfortable with this, human rights logic is not. Indeed, universality is a key principle in human rights, and for this reason the maximization exercise should aim at satisfying all citizens’ requests when goods and services necessary to secure human rights are at stake. The ordinal version of utilitarianism, in turn, contradicts the human rights principle of indivisibility. Contrary to ordinal utility theory, which ranks baskets of goods, human rights do not accept ranking when these goods and services are necessary to secure human rights. Second, by relying preferably on market logic to allocate goods and services, mainstream economics contradicts human rights because the intermediation of money prices and the purpose of profit may cause exclusion, thus compromising the principle of universality. Finally, mainstream economics sees human rights mainly as constraints on the development of its logic. According to this view, securing human rights would then be considered a cost weighing on economic efficiency and, therefore, something to be minimized. Fully realizing human rights therefore requires a different approach. This paper discusses a human rights-based political economy. This political economy, among other characteristics, should give up mainstream economics' narrow utilitarian approach, give up its belief that market logic should guide all exchanges of goods and services between human beings, and finally give up its view of human rights as constraints on rational choice and consequently on good economic performance. Giving up mainstream economics' narrow utilitarian approach means, first, embracing procedural utility and human rights-aimed consequentialism. Second, a more radical break can be imagined: non-utilitarian, or even anti-utilitarian, approaches may then emerge as alternatives, although these two standpoints are not necessarily mutually exclusive. Giving up market exclusivity means embracing decommodification. More specifically, this means an approach that takes into consideration the value produced outside the market and an allocation process no longer necessarily centered on money prices. Giving up the view of human rights as constraints means, finally, considering human rights as an expression of wellbeing and a manifestation of choice. This means, in turn, an approach that uses indicators of economic performance other than growth at the macro level and profit at the micro level, because what we measure affects what we do.

Keywords: economic and social rights, political economy, economic theory, markets

Procedia PDF Downloads 146
12916 Genetic and Environmental Variation in Reproductive and Lactational Performance of Holstein Cattle

Authors: Ashraf Ward

Abstract:

The effect of calving interval on 305-day milk yield over the first three lactations was studied in order to increase the efficiency of selection schemes and to manage more efficiently Holstein cows raised on small farms in Libya. Results obtained by processing data from 1476 cows, managed on 935 small-scale farms, indicated that the current calving interval significantly affects milk production in the first three lactations (p<0.05). The preceding calving interval affected 305-day milk yield (p<0.05) in the second lactation only. A linear regression model accounted for 20-25% of the total variance of 305-day milk yield. Extension of the calving interval beyond 420, 430 and 450 days for the first, second and third lactations, respectively, did not increase milk production when converted to a 305-day lactation. Stochastic relations between calving interval and calving age and month are moderate, with Pearson's correlation coefficients ranging from 0.38 to 0.69. Adjustment of milk production in order to reduce the effect of calving interval on the total phenotypic variance of milk yield is valid for the first lactation only. Adjustment of 305-day milk yield for the second and third lactations in order to reduce the effects of calving age and month brings about, at the same time, the elimination of the calving interval effect.

Keywords: milk yield, Holstein, non-genetic, calving

Procedia PDF Downloads 415