Search results for: Fisher criterion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 773

503 Recognition of Grocery Products in Images Captured by Cellular Phones

Authors: Farshideh Einsele, Hassan Foroosh

Abstract:

In this paper, we present a robust algorithm to recognize text extracted from grocery product images captured by mobile phone cameras. Recognition of such text is challenging since text in grocery product images varies in size, orientation, style, and illumination, and can suffer from perspective distortion. Pre-processing is performed to make the characters scale- and rotation-invariant. Since text degradations cannot be appropriately described by well-known geometric transformations such as translation, rotation, affine transformation, and shearing, we use the whole set of character black pixels as our feature vector. Classification is performed with a minimum distance classifier using the maximum likelihood criterion, which delivers a very promising Character Recognition Rate (CRR) of 89%. We achieve a considerably higher Word Recognition Rate (WRR) of 99% when using lower-level linguistic knowledge about product words during the recognition process.
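The minimum-distance classification step described above can be sketched as follows; the 3x3 glyphs and class templates are invented toy data, not the authors' feature set or implementation:

```python
# Hedged sketch of a minimum-distance (nearest class template) classifier
# over whole-character binary pixel vectors. Toy 3x3 glyphs only.

def flatten(glyph):
    """Flatten a 2D binary glyph into a 1D pixel feature vector."""
    return [p for row in glyph for p in row]

def distance_sq(a, b):
    """Squared Euclidean distance between two pixel vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(glyph, templates):
    """Return the label of the nearest class template."""
    vec = flatten(glyph)
    return min(templates, key=lambda label: distance_sq(vec, templates[label]))

# Toy class templates ('I' and 'O' on a 3x3 grid).
templates = {
    "I": flatten([[0, 1, 0], [0, 1, 0], [0, 1, 0]]),
    "O": flatten([[1, 1, 1], [1, 0, 1], [1, 1, 1]]),
}

sample = [[0, 1, 0], [0, 1, 0], [0, 1, 0]]  # a clean 'I'
print(classify(sample, templates))  # -> I
```

In the paper's setting, each template would be the mean pixel vector of a character class after scale and rotation normalization.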

Keywords: camera-based OCR, feature extraction, document, image processing, grocery products

Procedia PDF Downloads 393
502 Parallel Genetic Algorithms Clustering for Handling Recruitment Problem

Authors: Walid Moudani, Ahmad Shahin

Abstract:

This research presents a study of a recruitment services system. It aims to enhance a business intelligence system by embedding data mining in its core engine and to facilitate the link between job searchers and recruiting companies. The purpose of this study is to present an intelligent management system supporting recruitment services based on data mining methods. It applies segmentation to the job postings offered by the different recruiters. The details of the job postings are associated with a set of relevant features extracted from the web and based on critical criteria, in order to define consistent clusters. Thereafter, we assign each job searcher to the best cluster while providing a ranking over the job postings of the selected cluster. The performance of the proposed model is analyzed on a real case study, using the clustered job postings dataset and the classified job searchers dataset, by means of several metrics.

Keywords: job postings, job searchers, clustering, genetic algorithms, business intelligence

Procedia PDF Downloads 313
501 Application of the Standard Deviation in Regulating Design Variation of Urban Solutions Generated through Evolutionary Computation

Authors: Mohammed Makki, Milad Showkatbakhsh, Aiman Tabony

Abstract:

Computational applications of natural evolutionary processes as problem-solving tools have been well established since the mid-20th century. However, their application within architecture and design has only gained ground in recent years, with an increasing number of academics and professionals in the field electing to utilize evolutionary computation to address problems comprising multiple conflicting objectives with no clear optimal solution. Recent advances in computer science, and their consequent constructive influence on the architectural discourse, have led to the emergence of multiple algorithmic processes capable of simulating the evolutionary process in nature within an efficient timescale. Many of the developed processes for generating a population of candidate solutions to a design problem through an evolutionary-based stochastic search are driven by both environmental and architectural parameters. These methods allow conflicting objectives to be simultaneously, independently, and objectively optimized. This is an essential approach in design problems whose final product must address the demands of a multitude of individuals with various requirements. However, one of the main challenges encountered in applying an evolutionary process as a design tool is the ability of the simulation to maintain variation amongst design solutions in the population while simultaneously increasing in fitness. This is most commonly known as the ‘golden rule’ of balancing exploration and exploitation over time; the difficulty of achieving this balance lies in the tendency for either variation or optimization to be favored as the simulation progresses.
In such cases, the generated population of candidate solutions has either optimized very early in the simulation or has continued to maintain high levels of variation from which an optimal set could not be discerned, thus providing the user with a solution set that has not evolved efficiently towards the objectives outlined in the problem at hand. As such, the experiments presented in this paper seek to achieve the ‘golden rule’ by incorporating a mathematical fitness criterion for the development of an urban tissue composed of the superblock as its primary architectural element. The mathematical value investigated in the experiments is the standard deviation. Traditionally, the standard deviation has been used as an analytical value rather than a generative one, conventionally measuring the distribution of variation within a population by calculating the degree to which the population deviates from the mean. A lower standard deviation indicates that the majority of the population is clustered around the mean, and thus limited variation within the population, while a higher standard deviation reflects greater variation within the population and a lack of convergence towards an optimal solution. The results presented aim to clarify the extent to which utilizing the standard deviation as a fitness criterion can be advantageous for generating fitter individuals in a more efficient timeframe when compared to conventional simulations that incorporate only architectural and environmental parameters.
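Since the fitness criterion above is the population standard deviation of a design metric, a minimal sketch (with an invented metric and invented values, not the authors' urban model) is possible with Python's standard library alone:

```python
# Illustrative sketch: using the population standard deviation of one
# design metric as a generative measure of variation. A low value signals
# convergence (individuals clustered around the mean); a high value signals
# an unconverged, highly varied population.
from statistics import pstdev

def variation(metric_values):
    """Population standard deviation of a design metric across individuals."""
    return pstdev(metric_values)

converged = [10.0, 10.1, 9.9, 10.0]   # clustered around the mean
varied = [4.0, 16.0, 7.5, 12.5]       # spread widely across the range

print(variation(converged) < variation(varied))  # -> True
```

In a generative setting, such a value could be fed back into the selection step alongside the architectural and environmental objectives.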

Keywords: architecture, computation, evolution, standard deviation, urban

Procedia PDF Downloads 118
500 The Roles of Pay Satisfaction and Intent to Leave on Counterproductive Work Behavior among Non-Academic University Employees

Authors: Abiodun Musbau Lawal, Sunday Samson Babalola, Uzor Friday Ordu

Abstract:

The issue of employees' counterproductive work behavior (CWB) in government-owned organizations in emerging economies continues to be a major concern. This study investigated pay satisfaction, intent to leave, and age as predictors of counterproductive work behavior among non-academic employees in a Nigerian federal government-owned university. A sample of 200 non-academic employees completed questionnaires. Hierarchical multiple regression was conducted to determine the contribution of each predictor variable to the criterion variable, counterproductive work behavior. Results indicate that participants' age (β = -.18; p < .05) significantly and independently predicted CWB, accounting for 3% of the explained variance. The addition of pay satisfaction (β = -.14; p < .05) significantly brought the explained variance to 5%, while intent to leave (β = -.17; p < .05) further brought it to 8%. The importance of these findings for reducing counterproductive work behavior is highlighted.

Keywords: counterproductive, work behaviour, pay satisfaction, intent to leave

Procedia PDF Downloads 359
499 Deep Reinforcement Learning with Ornstein-Uhlenbeck Process-Based Recommender System

Authors: Khalil Bachiri, Ali Yahyaouy, Nicoleta Rogovschi

Abstract:

Improved user experience is a goal of contemporary recommender systems. Recommender systems are starting to incorporate reinforcement learning since it naturally addresses the goal of increasing a user’s reward every session. In this paper, we examine the most effective reinforcement learning agent tactics on the MovieLens (1M) dataset, balancing precision and variety of recommendations. The absence of variability in final predictions makes simplistic techniques, although able to optimize ranking quality criteria, worthless for consumers of the recommendation system. Utilizing the stochasticity of Ornstein-Uhlenbeck processes, our suggested strategy encourages the agent to explore its surroundings. Our experiments demonstrate that raising the NDCG (Normalized Discounted Cumulative Gain) and HR (Hit Rate) criteria without lowering the Ornstein-Uhlenbeck process drift coefficient enhances the diversity of suggestions.
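Ornstein-Uhlenbeck noise of the kind referred to above is commonly added to a DDPG agent's continuous actions to drive exploration. A minimal sketch follows; the parameter values are conventional defaults, not taken from the paper:

```python
# Hedged sketch of Ornstein-Uhlenbeck exploration noise for DDPG-style
# agents. theta is the drift (mean-reversion) coefficient mentioned in
# the abstract; sigma scales the Gaussian diffusion term.
import random

class OUNoise:
    def __init__(self, theta=0.15, mu=0.0, sigma=0.2, seed=0):
        self.theta, self.mu, self.sigma = theta, mu, sigma
        self.x = mu
        self.rng = random.Random(seed)

    def sample(self):
        # Mean-reverting drift towards mu plus Gaussian diffusion; unlike
        # i.i.d. noise, successive samples are temporally correlated.
        self.x += (self.theta * (self.mu - self.x)
                   + self.sigma * self.rng.gauss(0.0, 1.0))
        return self.x

noise = OUNoise()
samples = [noise.sample() for _ in range(2000)]
mean = sum(samples) / len(samples)
print(abs(mean) < 0.5)  # the process stays mean-reverting around mu
```

The temporal correlation is what distinguishes this from plain Gaussian action noise: consecutive exploratory actions push in a consistent direction before reverting.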

Keywords: recommender systems, reinforcement learning, deep learning, DDPG, Ornstein-Uhlenbeck process

Procedia PDF Downloads 121
498 Free Vibration Analysis of Pinned-Pinned and Clamped-Clamped Equal Strength Columns under Self-Weight and Tip Force Using Differential Quadrature Method

Authors: F. Waffo Tchuimmo, G. S. Kwandio Dongoua, C. U. Yves Mbono Samba, O. Dafounansou, L. Nana

Abstract:

The strength criterion is an important condition of great interest for guaranteeing the stability of structural elements. The present work studies the free vibration of an Euler-Bernoulli column of equal strength in compression, considering its own weight and an axial tip load in compression or tension, subjected to symmetrical boundary conditions. We use the differential quadrature method to investigate the first five natural frequency parameters of the column for different geometrical cross-sections. The results of this work help in making a judicious choice of the type of cross-section and of the boundary conditions that guarantee good stability of this type of column in civil construction.

Keywords: free vibration, equal strength, self-weight, tip force, differential quadrature method

Procedia PDF Downloads 105
497 Self-Organizing Maps for Exploration of Partially Observed Data and Imputation of Missing Values in the Context of the Manufacture of Aircraft Engines

Authors: Sara Rejeb, Catherine Duveau, Tabea Rebafka

Abstract:

To monitor the production process of turbofan aircraft engines, multiple measurements of various geometrical parameters are systematically recorded on manufactured parts. Engine parts are subject to extremely high standards as they can impact the performance of the engine. Therefore, it is essential to analyze these databases to better understand the influence of the different parameters on the engine's performance. Self-organizing maps are unsupervised neural networks which achieve two tasks simultaneously: they visualize high-dimensional data by projection onto a 2-dimensional map and provide clustering of the data. This technique has become very popular for data exploration since it provides easily interpretable results and a meaningful global view of the data. As such, self-organizing maps are usually applied to aircraft engine condition monitoring. As databases in this field are huge and complex, they naturally contain multiple missing entries for various reasons. The classical Kohonen algorithm to compute self-organizing maps is conceived for complete data only. A naive approach to deal with partially observed data consists in deleting items or variables with missing entries. However, this requires a sufficient number of complete individuals to be fairly representative of the population; otherwise, deletion leads to a considerable loss of information. Moreover, deletion can also induce bias in the analysis results. Alternatively, one can first apply a common imputation method to create a complete dataset and then apply the Kohonen algorithm. However, the choice of the imputation method may have a strong impact on the resulting self-organizing map. Our approach is to address simultaneously the two problems of computing a self-organizing map and imputing missing values, as these tasks are not independent. In this work, we propose an extension of self-organizing maps for partially observed data, referred to as missSOM. 
First, we introduce a criterion to be optimized, that aims at defining simultaneously the best self-organizing map and the best imputations for the missing entries. As such, missSOM is also an imputation method for missing values. To minimize the criterion, we propose an iterative algorithm that alternates the learning of a self-organizing map and the imputation of missing values. Moreover, we develop an accelerated version of the algorithm by entwining the iterations of the Kohonen algorithm with the updates of the imputed values. This method is efficiently implemented in R and will soon be released on CRAN. Compared to the standard Kohonen algorithm, it does not come with any additional cost in terms of computing time. Numerical experiments illustrate that missSOM performs well in terms of both clustering and imputation compared to the state of the art. In particular, it turns out that missSOM is robust to the missingness mechanism, which is in contrast to many imputation methods that are appropriate for only a single mechanism. This is an important property of missSOM as, in practice, the missingness mechanism is often unknown. An application to measurements on one type of part is also provided and shows the practical interest of missSOM.
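The alternating scheme described above can be sketched conceptually as follows. This is an illustration of the idea only, not the authors' missSOM code: a real Kohonen map also updates the winner's grid neighbors with a decaying neighborhood function, omitted here for brevity, and the toy data are invented:

```python
# Hedged sketch: alternate (1) best-matching-unit assignment using observed
# entries only, (2) imputation of missing entries from the winning
# prototype, and (3) prototype learning on the completed item.

def bmu(x, mask, prototypes):
    """Index of the best matching prototype, using observed dims only."""
    def dist(p):
        return sum((xi - pi) ** 2 for xi, pi, m in zip(x, p, mask) if not m)
    return min(range(len(prototypes)), key=lambda k: dist(prototypes[k]))

def miss_som(data, prototypes, iters=20, lr=0.5):
    items = [list(x) for x in data]
    masks = [[v is None for v in x] for x in data]
    for _ in range(iters):
        for x, mask in zip(items, masks):
            k = bmu(x, mask, prototypes)
            for d, missing in enumerate(mask):
                if missing:                      # imputation step
                    x[d] = prototypes[k][d]
            # learning step: move the winner towards the completed item
            prototypes[k] = [p + lr * (xi - p)
                             for p, xi in zip(prototypes[k], x)]
    return items, prototypes

# Two well-separated toy clusters; None marks a missing entry.
data = [[0.0, 0.1], [0.1, None], [5.0, 5.1], [None, 5.0]]
completed, protos = miss_som(data, prototypes=[[0.0, 0.0], [5.0, 5.0]])
print(completed[1], completed[3])  # missing entries imputed near their cluster
```

Because imputation is redone from the current winner at every pass, the imputed values and the map refine each other, which is the coupling the criterion above formalizes.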

Keywords: imputation method of missing data, partially observed data, robustness to missingness mechanism, self-organizing maps

Procedia PDF Downloads 133
496 Investigation of Static Stability of Soil Slopes Using Numerical Modeling

Authors: Seyed Abolhasan Naeini, Elham Ghanbari Alamooti

Abstract:

The static stability of soil slopes was investigated using numerical simulation with the finite element code ABAQUS, and safety factors of the slopes were obtained under the static load of a 10-storey building. The embankments have the same soil conditions but different loading distances from the slope heel. The numerical method for estimating safety factors is the 'Strength Reduction Method' (SRM), and the Mohr-Coulomb failure criterion was used in the simulations. Two steps were used for measuring the safety factors of the slopes: the first under gravity loading, and the second under the static loading of a building near the slope heel. The safety factors obtained from the SRM are compared with values from the Limit Equilibrium Method (LEM). Results show good agreement between SRM and LEM. It is also seen that safety factors increase as the loading distance from the slope heel increases.
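The SRM-versus-LEM comparison can be illustrated on a closed-form case. The sketch below uses a dry infinite slope with assumed soil parameters as a toy stand-in for the paper's ABAQUS model: SRM divides c and tan(phi) by a trial factor until the reduced strength is just at failure, and the recovered factor matches the analytic LEM factor of safety:

```python
# Hedged illustration: strength reduction (SRM) recovers the limit
# equilibrium (LEM) factor of safety on a dry infinite slope.
# All soil parameters below are assumed values.
import math

GAMMA, Z, BETA = 18.0, 5.0, math.radians(30)  # unit weight kN/m3, depth m, slope angle
C, PHI = 10.0, math.radians(25)               # cohesion kPa, friction angle

def fs_lem(c, tan_phi):
    """Closed-form LEM factor of safety for a dry infinite slope."""
    return (c / (GAMMA * Z * math.sin(BETA) * math.cos(BETA))
            + tan_phi / math.tan(BETA))

def stable(f):
    """SRM check: is the slope still stable with c/f and tan(phi)/f?"""
    return fs_lem(C / f, math.tan(PHI) / f) >= 1.0

def fs_srm(lo=0.1, hi=10.0, tol=1e-6):
    """Bisect on the reduction factor until the reduced strength is at failure."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if stable(mid) else (lo, mid)
    return 0.5 * (lo + hi)

print(round(fs_lem(C, math.tan(PHI)), 3), round(fs_srm(), 3))  # the two agree
```

In a finite element SRM run, the `stable` check is replaced by whether the nonlinear analysis still converges at the reduced strength; the bisection logic is the same.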

Keywords: limit equilibrium method, static stability, soil slopes, strength reduction method

Procedia PDF Downloads 142
495 Review and Evaluation of Viscous Damper on Structural Responses

Authors: Ehsan Sadie

Abstract:

Developments in damping technology and advances in equipping structures with dampers have resulted from sustained research and testing in this field. In this paper, a two-story building is modeled with the help of SAP2000 software, and the effect of a viscous damper on the performance of the structure is explained. The response investigated is the horizontal displacement of the floors. The structure is modeled once without a damper and again with one, and the results are presented in tables and graphs. Since the seismic behavior of the structure is studied, the responses show that viscous dampers appreciably reduce floor displacements, and the energy dissipated in the structure with dampers is significantly greater than in the structure without them. It is therefore economical to use viscous dampers in areas with relatively high earthquake risk.

Keywords: bending frame, displacement criterion, dynamic response spectra, earthquake, non-linear history spectrum, SAP2000 software, structural response, viscous damper

Procedia PDF Downloads 104
494 Reliability and Validity of a Portable Inertial Sensor and Pressure Mat System for Measuring Dynamic Balance Parameters during Stepping

Authors: Emily Rowe

Abstract:

Introduction: Balance assessments can be used to help evaluate a person’s risk of falls, determine causes of balance deficits and inform intervention decisions. It is widely accepted that instrumented quantitative analysis can be more reliable and specific than semi-qualitative ordinal scales or itemised scoring methods. However, the uptake of quantitative methods is hindered by expense, lack of portability, and set-up requirements. During stepping, foot placement is actively coordinated with the body centre of mass (COM) kinematics during pre-initiation. Based on this, the potential to use COM velocity just prior to foot off, together with foot placement error, as an outcome measure of dynamic balance is currently being explored using complex 3D motion capture. Inertial sensors and pressure mats might be more practical technologies for measuring these parameters in clinical settings. Objective: The aim of this study was to test the criterion validity and test-retest reliability of a synchronised inertial sensor and pressure mat-based approach to measure foot placement error and COM velocity while stepping. Methods: Trials were held with 15 healthy participants, each of whom attended two sessions. The trial task was to step onto one of 4 targets (2 for each foot) multiple times in a random, unpredictable order. The stepping target was cued using an auditory prompt and electroluminescent panel illumination. Data were collected using 3D motion capture and a combined inertial sensor-pressure mat system simultaneously in both sessions. To assess the reliability of each system, ICC estimates and their 95% confidence intervals were calculated based on a mean-rating (k = 2), absolute-agreement, 2-way mixed-effects model. To test the criterion validity of the combined inertial sensor-pressure mat system against the motion capture system, multi-factorial two-way repeated measures ANOVAs were carried out.
Results: It was found that foot placement error was not reliably measured between sessions by either system (ICC 95% CIs; motion capture: 0 to >0.87 and pressure mat: <0.53 to >0.90). This could be due to genuine within-subject variability given the nature of the stepping task and brings into question the suitability of average foot placement error as an outcome measure. Additionally, results suggest the pressure mat is not a valid measure of this parameter since it was statistically significantly different from, and much less precise than, the motion capture system (p=0.003). The inertial sensor was found to be a moderately reliable (ICC 95% CIs >0.46 to >0.95) but not valid measure for anteroposterior and mediolateral COM velocities (AP velocity: p<0.001; ML velocity targets 1 to 4: p=0.734, 0.001, <0.001 & 0.376). However, it is thought that with further development, the validity of the COM velocity measure could be improved. Possible options to investigate include whether inertial sensor placement relative to pelvic marker placement has an effect, and whether more complex data processing methods could manage inherent accelerometer and gyroscope limitations. Conclusion: The pressure mat is not a suitable alternative for measuring foot placement errors. The inertial sensors have the potential for measuring COM velocity; however, further development work is needed.
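The ICC variant named in the methods (two-way model, absolute agreement, average of k = 2 ratings, i.e. McGraw and Wong's ICC(A,k)) can be computed from the three mean squares directly. The sketch below uses made-up session scores, not the study's measurements:

```python
# Hedged sketch of ICC(A,k): two-way model, absolute agreement, average
# measures. ratings is an n-subjects x k-raters table.
def icc_a_k(ratings):
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ms_r = ss_rows / (n - 1)                                      # subjects
    ms_c = ss_cols / (k - 1)                                      # raters/sessions
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))   # residual
    return (ms_r - ms_e) / (ms_r + (ms_c - ms_e) / n)

# Perfect session-1/session-2 agreement should give ICC = 1.
perfect = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0], [4.0, 4.0]]
print(round(icc_a_k(perfect), 6))  # -> 1.0
```

Absolute agreement (rather than consistency) is the right choice here because a systematic between-session offset should count against test-retest reliability.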

Keywords: dynamic balance, inertial sensors, portable, pressure mat, reliability, stepping, validity, wearables

Procedia PDF Downloads 134
493 Blood Pressure Level, Targeted Blood Pressure Control Rate, and Factors Related to Blood Pressure Control in Post-Acute Ischemic Stroke Patients

Authors: Nannapus Saramad, Rewwadee Petsirasan, Jom Suwanno

Abstract:

Background: This retrospective study was designed to describe average blood pressure, blood pressure levels, and target blood pressure control rates in post-stroke patients in the year following discharge from Sichon Hospital, Sichon District, Nakhon Si Thammarat Province. Secondary data analysis was performed on patients' health records, supplemented by patient or caregiver interviews. A total of 232 eligible post-acute ischemic stroke patients discharged in 2017-2018 were recruited. Methods: Univariate analyses (the Chi-square test and Fisher's exact test) were used to identify relationships for single variables; variables with a p-value < 0.2 were analyzed by binary logistic regression. Results: Most of the patients in this study were men (61.6%), with an average age of 65.4 ± 14.8 years. Systolic blood pressure remained at grade 1-2 hypertension levels, while diastolic pressure remained optimal or normal, throughout the period from initial treatment to the present. The results revealed that 25% of the group under 60 years of age achieved BP control, compared with 36.3% of the group older than 60 years and 27.9% of the diabetic group. The multivariate analysis revealed four significant variables: (1) receiving a calcium-channel blocker (p = .027); (2) adherence to antihypertensive medication (p = .024); (3) adherence to antiplatelet medication (p = .020); and (4) medication behavior (p = .010). Conclusion: Nurses and other health care providers should promote medication adherence and health behaviors to improve patients' blood pressure control.

Keywords: acute ischemic stroke, target blood pressure control, medication adherence, recurrence stroke

Procedia PDF Downloads 109
492 Discrete Crack Modeling of Side Face FRP-Strengthened Concrete Beam

Authors: Shahriar Shahbazpanahi, Mohammad Hemen Jannaty, Alaleh Kamgar

Abstract:

Shear strengthening of concrete structures can be carried out with external fibre-reinforced polymer (FRP). In the present investigation, a new fracture mechanics model is developed for a concrete beam strengthened on its side faces by external FRP. The discrete crack is simulated by a spring element with softening behavior ahead of the crack tip to model the cohesive zone in the concrete. A truss element is used, parallel to the spring element, to simulate the energy dissipation rate of the FRP. The strain energy release rate is calculated directly using a virtual crack closure technique, and a crack propagation criterion is then presented. The results are found acceptable when compared with previous experimental results and ABAQUS software data. It is observed that the length of the fracture process zone (FPZ) increases when FRP is applied to the side face, at the same load, compared with the control beam.

Keywords: FPZ, fracture, FRP, shear

Procedia PDF Downloads 522
491 Eco-Friendly Textiles: The Power of Natural Dyes

Authors: Bushra

Abstract:

This paper explores the historical significance, ecological benefits, and contemporary applications of natural dyes in textile dyeing, aiming to provide a comprehensive overview of their potential to contribute to a sustainable fashion industry while minimizing ecological footprints. It examines natural dyes as a sustainable alternative to synthetic dyes in the textile industry, covering their historical context, sources, and environmental benefits. Natural dyes come from plants, animals, and minerals, including roots, leaves, bark, fruits, flowers, and insects, with metal salts used as mordants to fix dyes to fabrics. Natural dyeing involves extraction, mordanting, and dyeing techniques; experimental research shows that optimizing these processes can enhance the performance of natural dyes and make them viable for contemporary textile applications. Natural dyes are biodegradable, non-toxic, and renewable, reducing pollution, promoting biodiversity, and lessening reliance on petrochemicals. They nevertheless face challenges in color consistency, scalability, and performance, and industrial production must meet modern consumer standards for durability and colorfastness. Contemporary initiatives in the textile industry include fashion brands such as Eileen Fisher and Patagonia incorporating natural dyes, artisans such as India Flint (Botanical Alchemy) promoting traditional dyeing techniques, and research projects such as the European Union's Horizon 2020 program. Natural dyes thus offer the textile industry a sustainable path that reduces environmental impact and promotes harmony with nature, and ongoing research and innovation are paving the way for their widespread adoption.

Keywords: historical significance, textile industry, natural dyes, sustainability

Procedia PDF Downloads 22
490 Microbial Electrochemical Remediation System: Integrating Wastewater Treatment with Simultaneous Power Generation

Authors: Monika Sogani, Zainab Syed, Adrian C. Fisher

Abstract:

Pollution by estrogenic compounds has caught the attention of researchers, as even a slight increase of estrogens in water bodies has a significant impact on the aquatic system. These compounds belong to the class of endocrine disrupting compounds (EDCs) and are able to mimic hormones or interfere with the action of endogenous hormones. A microbial electrochemical remediation system (MERS) is employed here, exploiting an electro-phototrophic bacterium to evaluate the biodegradation of the hormone ethinylestradiol (EE2) under anaerobic conditions with simultaneous power generation. MERS using an electro-phototrophic bacterium offers a tailored wastewater treatment solution for a developing country like India, which has huge solar potential. It is a clean energy-generating technology, requiring only sunlight, water, nutrients, and carbon dioxide to operate. Its main advantage over other technologies is that its main fuel, sunlight, is indefinitely available. When grown in light with organic compounds, these photosynthetic bacteria generate ATP by cyclic photophosphorylation and use carbon compounds to build cell biomass (photoheterotrophic growth). The cells degraded EE2 and were able to generate hydrogen as part of the nitrogen fixation process. Two MERS designs were studied, and a maximum decrease in EE2 of 88.45% was observed over 14 days in the better design. This research provides better insight into microbial electricity generation and self-sustaining wastewater treatment facilities. Such waste-to-energy treatment models should be pursued and implemented to build a resource-efficient and sustainable economy.

Keywords: endocrine disrupting compounds, ethinylestradiol, microbial electrochemical remediation systems, wastewater treatment

Procedia PDF Downloads 103
489 Phase II Monitoring of First-Order Autocorrelated General Linear Profiles

Authors: Yihua Wang, Yunru Lai

Abstract:

Statistical process control has been successfully applied in a variety of industries. In some applications, the quality of a process or product is better characterized and summarized by a functional relationship between a response variable and one or more explanatory variables. A collection of this type of data is called a profile. Profile monitoring is used to understand and check the stability of this relationship or curve over time. The independence assumption for the error term is commonly used in existing profile monitoring studies. However, in many applications, the profile data show correlations over time. Therefore, in this study we focus on a general linear regression model with first-order autocorrelation between profiles. We propose an exponentially weighted moving average (EWMA) charting scheme to monitor this type of profile. The simulation study shows that our proposed methods outperform existing schemes based on the average run length criterion.
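A generic scalar EWMA chart with exact time-varying control limits can be sketched as follows. The paper's scheme monitors profile regression estimates; the univariate statistic and parameter values below are only an illustrative stand-in:

```python
# Hedged sketch of an EWMA control chart for a scalar in-control stream
# with known target and standard deviation. lam is the smoothing weight,
# L the control-limit multiplier.
def ewma_chart(observations, target, sigma, lam=0.2, L=3.0):
    """Return (ewma values, out-of-control flags) for a scalar stream."""
    z, zs, out = target, [], []
    for i, x in enumerate(observations, start=1):
        z = lam * x + (1 - lam) * z
        # Exact (time-varying) half-width of the EWMA control limits.
        half_width = L * sigma * (lam / (2 - lam)
                                  * (1 - (1 - lam) ** (2 * i))) ** 0.5
        zs.append(z)
        out.append(abs(z - target) > half_width)
    return zs, out

in_control = [0.1, -0.2, 0.0, 0.3, -0.1]
shifted = in_control + [2.0, 2.2, 1.9, 2.1, 2.3]  # sustained mean shift
_, flags = ewma_chart(shifted, target=0.0, sigma=1.0)
print(any(flags[len(in_control):]))   # -> True, the shift is detected
print(any(flags[:len(in_control)]))   # -> False for this stream
```

For autocorrelated profiles as in the paper, the limits would instead be calibrated against the AR(1) error structure, typically by tuning to a target in-control average run length.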

Keywords: autocorrelation, EWMA control chart, general linear regression model, profile monitoring

Procedia PDF Downloads 440
488 Optimal Peer-to-Peer On-Orbit Refueling Mission Planning with Complex Constraints

Authors: Jing Yu, Hongyang Liu, Dong Hao

Abstract:

On-orbit refueling is of great significance in extending spacecraft lifetime. The problem of minimum-fuel, time-fixed, peer-to-peer on-orbit refueling mission planning is addressed here, with the particular aim of assigning fuel-insufficient satellites to fuel-sufficient satellites and optimizing each rendezvous trajectory. Constraints including perturbation, communication links, sun illumination, hold points for different rendezvous phases, and sensor switching are considered. A planning model is established, together with a two-level solution method. The upper level deals with target assignment based on a fuel equilibrium criterion, while the lower level solves the constrained trajectory optimization using special maneuver strategies. Simulations show that the developed method effectively resolves the peer-to-peer on-orbit refueling mission planning problem and deals with complex constraints.

Keywords: mission planning, orbital rendezvous, on-orbit refueling, space mission

Procedia PDF Downloads 209
487 On the Study of All Waterloo Automaton Semilattices

Authors: Mikhail Abramyan, Boris Melnikov

Abstract:

The aim is to study the set of subsets of grids of the Waterloo automaton and the set of covering automata defined by the grid subsets. The study was carried out using the library for working with nondeterministic finite automata NFALib implemented by one of the authors (M. Abramyan) in C#. The results are regularities obtained when considering semilattices of covering automata for the Waterloo automaton. A complete description of the obtained semilattices from the point of view of equivalence of the covering automata to the original Waterloo automaton is given, the criterion of equivalence of the covering automaton to the Waterloo automaton in terms of properties of the subset of grids defining the covering automaton is formulated. The relevance of the subject area under consideration is due to the need to research a set of regular languages and, in particular, a description of their various subclasses. Also relevant are the problems that may arise in some subclasses. This will give, among other things, the possibility of describing new algorithms for the equivalent transformation of nondeterministic finite automata.

Keywords: nondeterministic finite automata, universal automaton, grid, covering automaton, equivalent transformation algorithms, the Waterloo automaton

Procedia PDF Downloads 64
486 The Implementation of Character Education in Code Riverbanks, Special Region of Yogyakarta, Indonesia

Authors: Ulil Afidah, Muhamad Fathan Mubin, Firdha Aulia

Abstract:

The Code riverbanks in Yogyakarta are a settlement area of middle- to lower-class residents, and this socio-economic situation affects the behavior of the community. This research aimed to find and explain the implementation and assessment of character education in elementary schools on the Code riverside in the Yogyakarta region of Indonesia. This is qualitative research whose subjects were children of the Code riverbanks, Yogyakarta. The data were collected through interviews and document studies and analyzed qualitatively using the interactive analysis model of Miles and Huberman. The results show that: (1) the learning process of character education was carried out by integrating aspects such as democratic and interactive learning sessions and introducing role models to the students; (2) the assessment of character education was done by teachers based on the teaching and learning process and activities outside the classroom, using criteria covering three aspects: cognitive, affective, and psychomotor.

Keywords: character, Code riverbanks, education, Yogyakarta

Procedia PDF Downloads 233
485 Shield Tunnel Excavation Simulation of a Case Study Using a So-Called 'Stress Relaxation' Method

Authors: Shengwei Zhu, Alireza Afshani, Hirokazu Akagi

Abstract:

Ground surface settlement induced by shield tunneling is attracting increasing attention as shield tunneling becomes a popular construction technique for tunnels in urban areas. This paper discusses a 2D longitudinal FEM simulation of a tunneling case study in Japan (Tokyo Metro Yurakucho Line). Field data on the tunneling-induced settlement had already been collected and is used here for comparison and evaluation purposes. The model accounts for earth pressure, face pressure, backfill grouting, an elastic tunnel lining, and a Mohr-Coulomb failure criterion for the soil elements. A method called 'stress relaxation' is also exploited to simulate the gradual excavation process. Ground surface settlements obtained from the numerical results using the introduced method are then compared with the measured data.
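The 'stress relaxation' idea can be sketched in one dimension: the support force initially holding the excavation boundary is released in fractions, and the ground re-equilibrates at each stage. The single linear spring below stands in for the FEM soil model; the numbers and the linear response are illustrative assumptions, not the paper's 2D model:

```python
# Toy illustration of the 'stress relaxation' excavation technique:
# the initial boundary force F0 is released in steps, so only
# (1 - lam) * F0 of support remains after each stage. For a linear
# spring of stiffness k, the released share lam * F0 is carried by
# the ground, giving displacement u = lam * F0 / k.
def staged_release(F0, k, release_factors):
    """Return the ground displacement after each relaxation stage."""
    displacements = []
    for lam in release_factors:
        displacements.append(lam * F0 / k)
    return displacements

u = staged_release(F0=100.0, k=50.0, release_factors=[0.25, 0.5, 0.75, 1.0])
# displacement grows monotonically toward the fully excavated value F0 / k
```

In the actual FEM simulation the re-equilibration at each stage is a nonlinear solve (Mohr-Coulomb soil), so the settlement path is not simply proportional as in this sketch.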

Keywords: 2D longitudinal FEM model, tunneling case study, stress relaxation, shield tunneling excavation

Procedia PDF Downloads 313
484 Multi-Criteria Goal Programming Model for Sustainable Development of India

Authors: Irfan Ali, Srikant Gupta, Aquil Ahmed

Abstract:

Every country needs sustainable development (SD) for its economic growth, achieved by forming suitable policies and initiating programs for the development of its different sectors. This paper comprises the modeling and optimization of different sectors of India, which together form a multi-criteria model. We develop a fractional goal programming (FGP) model that helps provide an efficient allocation of resources while simultaneously achieving sustainability goals for gross domestic product (GDP), electricity consumption (EC), and greenhouse gas (GHG) emissions by the year 2030. A weighted FGP model is also presented to obtain varying solutions according to the priorities set by the policy maker for the future goals of GDP growth, EC, and GHG emissions. The presented models provide useful insight to decision makers for implementing strategies in the different sectors.
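The weighted goal-programming idea can be illustrated with a deliberately tiny model (two sectors, invented coefficients and targets; the paper's fractional formulation and real Indian sector data are not reproduced here): penalize GDP falling short of its goal and EC/GHG exceeding theirs, with weights encoding the policy maker's priorities.

```python
from itertools import product

# Minimal weighted goal-programming sketch (illustrative numbers only):
# allocate a budget x = (x1, x2) to two sectors, each unit contributing
# to GDP, electricity consumption (EC) and emissions (GHG), and minimize
# the weighted deviations from the 2030 goals.
coeff = {"GDP": (3.0, 2.0), "EC": (1.0, 2.0), "GHG": (2.0, 1.0)}
goal = {"GDP": 12.0, "EC": 8.0, "GHG": 7.0}
weights = {"GDP": 2.0, "EC": 1.0, "GHG": 1.0}  # policy maker's priorities

def deviation_cost(x):
    cost = 0.0
    for name, (a1, a2) in coeff.items():
        value = a1 * x[0] + a2 * x[1]
        if name == "GDP":
            cost += weights[name] * max(goal[name] - value, 0.0)  # under-achievement
        else:
            cost += weights[name] * max(value - goal[name], 0.0)  # over-achievement
    return cost

# brute-force search over a coarse grid (a real model would use an LP solver)
grid = [i * 0.5 for i in range(11)]
best = min(product(grid, grid), key=deviation_cost)
```

Changing the weights shifts the optimal allocation, which is exactly how the paper's weighted FGP variant produces different solutions for different priority settings.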

Keywords: sustainable and economic development, multi-objective fractional programming, fuzzy goal programming, weighted fuzzy goal programming

Procedia PDF Downloads 206
483 Determination of Gross Alpha and Gross Beta Activity in Water Samples by iSolo Alpha/Beta Counting System

Authors: Thiwanka Weerakkody, Lakmali Handagiripathira, Poshitha Dabare, Thisari Guruge

Abstract:

The determination of gross alpha and gross beta activity in water is important in a wide array of environmental studies, and these parameters are covered by international legislation on water quality. The technique is commonly applied as a screening method in radioecology, environmental monitoring, industrial applications, etc. Measuring gross alpha and beta emitters with the iSolo alpha/beta counting system is an adequate nuclear technique for assessing radioactivity levels in natural and waste water samples, owing to its simplicity and low cost compared with other methods. Twelve water samples (six samples of commercially available bottled drinking water and six samples of industrial waste water) were measured by standard method EPA 900.0 using the gasless, firmware-based, single-sample, manual iSolo alpha/beta counter (Model: SOLO300G) with a solid-state silicon PIPS detector. Am-241 and Sr-90/Y-90 calibration standards were used to calibrate the detector. The minimum detectable activities are 2.32 mBq/L and 406 mBq/L for alpha and beta activity, respectively. Each 2 L water sample was evaporated (at low heat) to a small volume, transferred evenly (for homogenization) into a 50 mm stainless steel counting planchet, heated by an IR lamp until a constant-weight residue was obtained, and then counted for gross alpha and beta. The sample density on the planchet area was kept below 5 mg/cm². Large quantities of solid waste sludge and waste water are generated every year by various industries, and this water can be reused for different applications. Implementing water treatment plants and measuring the water quality parameters of industrial waste water discharge is therefore very important before release into the environment, since this waste may contain different types of pollutants, including radioactive substances.

All the measured waste water samples had gross alpha and beta activities lower than the maximum tolerance limits for the discharge of industrial waste into inland surface waters, namely 10⁻⁹ µCi/mL and 10⁻⁸ µCi/mL for gross alpha and gross beta, respectively (National Environmental Act, No. 47 of 1980, per the February 2008 extraordinary gazette of the Democratic Socialist Republic of Sri Lanka). The measured water samples were thus below the recommended radioactivity levels and pose no radiological hazard when released into the environment. Drinking water is an essential requirement of life. All the drinking water samples were below the permissible levels of 0.5 Bq/L for gross alpha activity and 1 Bq/L for gross beta activity proposed by the World Health Organization in 2011; the water is therefore acceptable for human consumption without further clarification of its radioactivity. As these screening levels are very low, the individual dose criterion (IDC) of 0.1 mSv/y, a criterion for evaluating health risks from long-term exposure to radionuclides in drinking water, would usually not be exceeded; this recommended level expresses a very low level of health risk. This monitoring work will be continued for environmental protection purposes.
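Comparing the measured values with the gazette limits requires a unit conversion, since the discharge limits are quoted in µCi/mL while the counter reports mBq/L. The arithmetic (1 µCi = 3.7 × 10⁴ Bq, 1 mL = 10⁻³ L) can be checked directly:

```python
# Unit-conversion check for the screening comparison:
# discharge limits are in microcuries per millilitre, counter results
# in millibecquerels per litre. 1 uCi = 3.7e4 Bq, 1 mL = 1e-3 L.
def uci_per_ml_to_mbq_per_l(x):
    bq_per_ml = x * 3.7e4       # uCi -> Bq
    bq_per_l = bq_per_ml * 1e3  # per mL -> per L
    return bq_per_l * 1e3       # Bq -> mBq

alpha_limit = uci_per_ml_to_mbq_per_l(1e-9)  # gross alpha discharge limit
beta_limit = uci_per_ml_to_mbq_per_l(1e-8)   # gross beta discharge limit
# 1e-9 uCi/mL = 3.7e-5 Bq/mL = 3.7e-2 Bq/L = 37 mBq/L
```

So the gazette limits correspond to 37 mBq/L (alpha) and 370 mBq/L (beta), which is the scale against which the counted sample activities are screened.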

Keywords: drinking water, gross alpha, gross beta, waste water

Procedia PDF Downloads 178
482 Collapse Performance of Steel Frame with Hysteretic Energy Dissipating Devices

Authors: Hyung-Joon Kim, Jin-Young Park

Abstract:

Energy dissipating devices (EDDs) have become popular as seismic-force-resisting systems for building structures. However, there is little information on the collapse capacities of frames employing EDDs, which are an important criterion for their seismic design. This study investigates the collapse capacities of steel frames with TADAS hysteretic energy dissipative devices (HEDDs), which are an alternative to steel braced frames. To do this, a 5-story steel ordinary concentrically braced frame and a steel frame with HEDDs are designed and modeled. Nonlinear dynamic analyses and incremental dynamic analyses are carried out with 40 ground motions scaled to the maximum considered earthquake. The analysis results show a significant enhancement of collapse capacity due to the introduction of HEDDs.

Keywords: collapse capacity, incremental dynamic analysis, steel braced frame, TADAS hysteretic energy dissipative device

Procedia PDF Downloads 470
481 Preference Heterogeneity as a Positive Rather Than Negative Factor towards Acceptable Monitoring Schemes: Co-Management of Artisanal Fishing Communities in Vietnam

Authors: Chi Nguyen Thi Quynh, Steven Schilizzi, Atakelty Hailu, Sayed Iftekhar

Abstract:

Territorial Use Rights for Fisheries (TURFs) have emerged as a promising tool for fisheries conservation and management. However, illegal fishing has undermined the effectiveness of TURFs, profoundly degrading global fish stocks and marine ecosystems. Conservation and management of fisheries therefore largely depend on the effective enforcement of fishing regulations, which requires co-enforcement by fishers. Fishers tend to resist participating in monitoring, however, because their views on monitoring scheme design have not received adequate attention. Fishers' acceptance of a monitoring scheme is more likely if there is a mechanism allowing them to engage in the early planning and design stages. This study carried out a choice experiment with 396 fishers in Vietnam to elicit their preferences for monitoring schemes and to estimate the relative importance fishers place on the key design elements. Preference heterogeneity was investigated using a Scale-Adjusted Latent Class Model that accounts for both preference and scale variance. Welfare changes associated with the proposed monitoring schemes were also examined. Five distinct preference classes were found, suggesting that no one-size-fits-all scheme suits all fishers. Although fishers prefer to be compensated more for their participation, compensation is not the driving element in fishers' choices: most fishers place higher value on other elements, such as institutional arrangements and monitoring capacity. Fishers' preferences are driven by their socio-demographic and psychological characteristics. Understanding how changes in the levels of design elements affect fishers' participation could provide policy makers with insights for tailoring monitoring scheme designs to the needs of different fisher classes.

Keywords: design of monitoring scheme, enforcement, heterogeneity, illegal fishing, territorial use rights for fisheries

Procedia PDF Downloads 311
480 Robust Pattern Recognition via Correntropy Generalized Orthogonal Matching Pursuit

Authors: Yulong Wang, Yuan Yan Tang, Cuiming Zou, Lina Yang

Abstract:

This paper presents a novel sparse representation method for robust pattern classification. Generalized orthogonal matching pursuit (GOMP) is a recently proposed, efficient sparse representation technique. However, GOMP adopts the mean square error (MSE) criterion and assigns the same weight to all measurements, whether severely or only slightly corrupted. To address this limitation, we propose an information-theoretic GOMP (ITGOMP) method that exploits the correntropy induced metric. The results show that ITGOMP adaptively assigns small weights to severely contaminated measurements and large weights to clean ones. An ITGOMP-based classifier is further developed for robust pattern classification. Experiments on public real-world datasets demonstrate the efficacy of the proposed approach.
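The weighting behavior described above can be sketched in isolation from the matching pursuit itself. Under the correntropy criterion with a Gaussian kernel, each measurement receives a weight exp(-r²/(2σ²)) from its residual r, so gross outliers are suppressed instead of dominating an MSE fit. The robust mean below is an illustrative stand-in for the full ITGOMP procedure (the kernel width σ and the data are invented):

```python
import math

# Sketch of the correntropy-induced weighting idea: Gaussian-kernel
# weights on residuals, iteratively re-estimated. Shown for a robust
# (weighted) mean rather than the full matching pursuit.
def correntropy_weights(residuals, sigma):
    return [math.exp(-(r * r) / (2.0 * sigma * sigma)) for r in residuals]

def robust_mean(samples, sigma=1.0, iters=20):
    est = sum(samples) / len(samples)  # start from the ordinary (MSE) mean
    for _ in range(iters):
        w = correntropy_weights([x - est for x in samples], sigma)
        est = sum(wi * xi for wi, xi in zip(w, samples)) / sum(w)
    return est

data = [1.0, 1.1, 0.9, 1.05, 50.0]  # one grossly corrupted measurement
# robust_mean(data) stays near 1, while the plain mean is pulled to ~10.8
```

The same reweighting, applied inside the GOMP residual updates, is what lets ITGOMP down-weight contaminated measurements during support selection.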

Keywords: correntropy induced metric, matching pursuit, pattern classification, sparse representation

Procedia PDF Downloads 338
479 Spatio-Temporal Analysis and Mapping of Malaria in Thailand

Authors: Krisada Lekdee, Sunee Sammatat, Nittaya Boonsit

Abstract:

This paper proposes a GLMM with spatial and temporal effects for malaria data in Thailand. A Bayesian method is used for parameter estimation via Gibbs sampling MCMC. A conditional autoregressive (CAR) model is assumed to represent the spatial effects, and the temporal correlation is represented through the covariance matrix of the random effects. The quarterly malaria data were extracted from the Bureau of Epidemiology, Ministry of Public Health of Thailand. The factors considered are rainfall and temperature. The results show that rainfall and temperature are positively related to the malaria morbidity rate. The posterior means of the estimated morbidity rates are used to construct the malaria maps. The five highest morbidity rates (per 100,000 population) are in Trat (Q3, 111.70), Chiang Mai (Q3, 104.70), Narathiwat (Q4, 97.69), Chiang Mai (Q2, 88.51), and Chanthaburi (Q3, 86.82). According to the DIC criterion, the proposed model performs better than a GLMM with spatial effects but without temporal terms.
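The model comparison above rests on the deviance information criterion, DIC = D̄ + p_D, where D̄ is the posterior mean deviance, D(θ) = -2 log L(θ), and p_D = D̄ - D(θ̄) is the effective number of parameters. A minimal computation (toy normal likelihood with invented "posterior draws", not the paper's malaria GLMM) looks like this:

```python
import math

# Sketch of the DIC criterion: DIC = Dbar + pD with
#   Dbar = posterior mean of the deviance D(theta) = -2 * log-likelihood,
#   pD   = Dbar - D(theta_bar), the effective number of parameters.
# Toy normal(theta, 1) likelihood; draws stand in for Gibbs output.
def deviance(theta, data, sd=1.0):
    loglik = sum(-0.5 * math.log(2 * math.pi * sd * sd)
                 - (x - theta) ** 2 / (2 * sd * sd) for x in data)
    return -2.0 * loglik

def dic(posterior_draws, data):
    dbar = sum(deviance(t, data) for t in posterior_draws) / len(posterior_draws)
    theta_bar = sum(posterior_draws) / len(posterior_draws)
    pd = dbar - deviance(theta_bar, data)  # >= 0 for a convex deviance
    return dbar + pd                       # lower DIC = preferred model

data = [0.9, 1.2, 1.1, 0.8]
draws = [0.95, 1.0, 1.05, 1.1]  # pretend posterior samples for theta
```

For the paper's comparison, the spatio-temporal GLMM attains the lower DIC value of the two candidate models.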

Keywords: Bayesian method, generalized linear mixed model (GLMM), malaria, spatial effects, temporal correlation

Procedia PDF Downloads 434
478 Functioning of the Public Distribution System and Calorie Intake in the State of Maharashtra

Authors: Balasaheb Bansode, L. Ladusingh

Abstract:

The public distribution system (PDS) is an important component of food security. It is a massive welfare program undertaken by the Government of India and, since India is a federal state, implemented by the state governments, with multiple objectives: eliminating hunger, reducing malnutrition, and making food consumption affordable. The program reaches the community level through various government agencies. This paper focuses on the accessibility of the PDS at the household level and on how the present policy framework results in exclusion and inclusion errors. It explores the sanctioned food grain quantity received at the household level under ration cards differentiated by income criterion, and it highlights the types of corruption in food distribution generated by the PDS. The data are secondary, from NSSO round 68 conducted in 2012. Bivariate and multivariate techniques have been used to understand the working of the PDS and household food consumption.

Keywords: calorie intake, entitled food quantity, poverty alleviation through PDS, targeting error

Procedia PDF Downloads 313
477 A Packet Loss Probability Estimation Filter Using Most Recent Finite Traffic Measurements

Authors: Pyung Soo Kim, Eung Hyuk Lee, Mun Suck Jang

Abstract:

A packet loss probability (PLP) estimation filter with a finite memory structure is proposed to estimate the packet rate mean and variance of the input traffic process in real time while removing undesired system and measurement noises. The proposed PLP estimation filter is developed under a weighted least squares criterion using only the finite traffic measurements in the most recent window. The filter is shown to have several inherent properties, such as unbiasedness, deadbeat response, and robustness. A guideline for choosing an appropriate window length is described, since the window length can significantly affect estimation performance. Computer simulations show that the proposed PLP estimation filter is superior to the Kalman filter for temporarily uncertain systems; one possible explanation is that the filtered estimate converges faster as the window length M decreases.
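The finite-memory idea can be sketched with an equally weighted sliding window (the paper's filter uses a weighted least squares criterion, which is not reproduced here): keeping only the M most recent packet-rate measurements lets the estimate track abrupt traffic changes without bias from stale history.

```python
from collections import deque

# Sketch of finite-memory estimation: only the M most recent
# measurements contribute to the packet-rate mean and variance.
# Equal weights are used here for simplicity; the paper's filter
# derives weights from a weighted least squares criterion.
class FiniteMemoryEstimator:
    def __init__(self, window):
        self.buf = deque(maxlen=window)  # old samples fall off automatically

    def update(self, measurement):
        self.buf.append(measurement)
        n = len(self.buf)
        mean = sum(self.buf) / n
        var = sum((x - mean) ** 2 for x in self.buf) / n
        return mean, var

est = FiniteMemoryEstimator(window=5)
stream = [10.0] * 10 + [40.0] * 10  # packet rate jumps at sample 10
means = [est.update(x)[0] for x in stream]
# five samples after the jump the windowed mean equals 40, while an
# all-history average would still be below 30
```

This also illustrates the window-length trade-off discussed in the abstract: a smaller M reacts faster to traffic changes but averages less noise, so the choice of M matters for estimation performance.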

Keywords: packet loss probability estimation, finite memory filter, infinite memory filter, Kalman filter

Procedia PDF Downloads 658
476 Determining the Constituents of the Sunnah of Prophet Muhammad (pbuh) in the Light of the Quran: A Clinical Approach

Authors: Aamir I. Yazdani, Dr. Muhammad Nasir J. Qureshi

Abstract:

The term Sunnah has been used in the Quran both for Allah Himself and for His messengers. The way Allah dealt with the peoples to whom messengers (rasuls) were sent is called Sunnatullāh by the Quran. Likewise, the Quran uses the same term for Prophet Muhammad (pbuh) following the trodden path (Sunnah) of his forefather Prophet Abraham (Alaihissalam). It implies, therefore, that the word Sunnah cannot be applied to matters of theoretical knowledge, such as faith; its ambit remains practices and actions linked to practical matters only. In the case of the Quran, there is complete agreement among all Muslims on what constitutes the book of Allah, based on ijma (unanimity, total agreement, consensus) and tawatur (uninterrupted continuity, without any gap). There seems to be no such unanimity on the question of what constitutes the Sunnah of Prophet Muhammad (pbuh), and Muslims have therefore adopted several approaches towards the Sunnah. This paper uses a qualitative methodology to determine a criterion for what constitutes the Sunnah of Prophet Muhammad (pbuh) and which practices fall within its precincts.

Keywords: Al-hikmah, Hereafter, practices, Tazkiya

Procedia PDF Downloads 119
475 Nonparametric Copula Approximations

Authors: Serge Provost, Yishan Zang

Abstract:

Copulas are currently utilized in finance, reliability theory, machine learning, signal processing, geodesy, hydrology and biostatistics, among several other fields of scientific investigation. It follows from Sklar's theorem that the joint distribution function of a multidimensional random vector can be expressed in terms of its associated copula and marginals. Since marginal distributions can easily be determined by a variety of techniques, we address the problem of securing the distribution of the copula. This will be done using several approaches: we obtain bivariate least-squares approximations of the empirical copulas, modify the kernel density estimation technique and propose a criterion for selecting appropriate bandwidths, differentiate linearized empirical copulas, secure Bernstein polynomial approximations of suitable degrees, and apply a corollary to Sklar's result. Illustrative examples involving actual observations are presented. The proposed methodologies are also applied to a sample generated from a known copula distribution in order to validate their effectiveness.
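Two of the approaches listed above can be sketched directly: the empirical copula of a bivariate sample (the fraction of observations whose normalized ranks fall below a given point) and its degree-m Bernstein polynomial smoothing. The implementation below is an illustrative stand-in for the authors' methodology (ties in the sample are assumed absent):

```python
from math import comb

def empirical_copula(sample):
    """Empirical copula of a bivariate sample (distinct values assumed)."""
    n = len(sample)
    rx = {x: r for r, x in enumerate(sorted(p[0] for p in sample), 1)}
    ry = {y: r for r, y in enumerate(sorted(p[1] for p in sample), 1)}
    def C(u, v):
        # fraction of points whose rank pair lies below (u*n, v*n)
        return sum(1 for x, y in sample if rx[x] <= u * n and ry[y] <= v * n) / n
    return C

def bernstein_copula(C, m):
    """Degree-m Bernstein polynomial smoothing of a copula-like function C."""
    def B(u, v):
        return sum(C(i / m, j / m)
                   * comb(m, i) * u**i * (1 - u)**(m - i)
                   * comb(m, j) * v**j * (1 - v)**(m - j)
                   for i in range(m + 1) for j in range(m + 1))
    return B
```

The Bernstein smoothing evaluates the empirical copula on a (m+1) × (m+1) grid and blends the values with binomial weights, yielding a polynomial (hence differentiable) surrogate whose degree m controls the smoothness of the approximation.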

Keywords: copulas, Bernstein polynomial approximation, least-squares polynomial approximation, kernel density estimation, density approximation

Procedia PDF Downloads 52
474 Relationship among Teams' Information Processing Capacity and Performance in Information System Projects: The Effects of Uncertainty and Equivocality

Authors: Ouafa Sakka, Henri Barki, Louise Cote

Abstract:

Uncertainty and equivocality are defined in the information processing literature as two task characteristics that require different information processing responses from managers. As uncertainty often stems from a lack of information, addressing it is thought to require the collection of additional data. On the other hand, as equivocality stems from ambiguity and a lack of understanding of the task at hand, addressing it is thought to require rich communication between those involved. Past research has provided weak to moderate empirical support for these hypotheses. The present study contributes to this literature by defining uncertainty and equivocality at the project level and investigating their moderating effects on the association between several project information processing constructs and project performance. The information processing constructs considered are the amount of information collected by the project team and the richness and frequency of the formal communications among team members when discussing the project's follow-up reports. Data on 93 information system development (ISD) project managers were collected in a questionnaire survey and analyzed via the Fisher test for correlation differences. The results indicate that the highest project performance levels were observed in projects characterized by high uncertainty and low equivocality, in which project managers were provided with detailed and updated information on project costs and schedules. In addition, our findings show that information about user needs and the technical aspects of the project is less useful for managing projects in which uncertainty and equivocality are high. Further, while the strongest positive effect of the interactive use of follow-up reports on performance occurred in projects where both uncertainty and equivocality were high, its weakest effect occurred when both were low.
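The Fisher test for correlation differences compares a correlation measured in one group of projects with the same correlation in another: each r is mapped through the Fisher z-transformation, z = ½ ln((1+r)/(1−r)), and the difference is tested against its standard error √(1/(n₁−3) + 1/(n₂−3)). The sketch below shows the standard procedure with invented numbers; the study's actual group splits and correlations may differ:

```python
import math

# Fisher r-to-z test for the difference between two independent
# correlations (standard procedure; sample values are invented).
def fisher_z(r):
    return 0.5 * math.log((1 + r) / (1 - r))  # Fisher transformation

def compare_correlations(r1, n1, r2, n2):
    """Two-sided z test of H0: the two population correlations are equal."""
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    z = (fisher_z(r1) - fisher_z(r2)) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, p

z, p = compare_correlations(r1=0.60, n1=45, r2=0.15, n2=48)
# a clearly larger correlation in the first group yields a small p-value
```

Applied per moderator level (high vs. low uncertainty, high vs. low equivocality), this is how differences in the information-processing/performance associations between project subgroups can be assessed.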

Keywords: uncertainty, equivocality, information processing model, management control systems, project control, interactive use, diagnostic use, information system development

Procedia PDF Downloads 269