Search results for: statistical optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2977

397 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair

Authors: Seyedvahid Najafi, Viliam Makis

Abstract:

In recent years, maintenance optimization has attracted special attention due to the growth of industrial systems' complexity. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operational reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and to solve the maintenance problem for such systems using the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subjected to condition monitoring, and its deterioration is modeled using a gamma process. The hazard rate of unit 1 is estimated by the proportional hazards model (PHM), and two hazard rate control limits are considered as the thresholds of maintenance interventions for unit 1. Maintenance is performed on unit 2 according to an age control limit. The objective is to find the optimal control limits that minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy I) with policy II, which is similar to policy I except that replacement is performed instead of general repair. Results show that policy I leads to a lower average cost than policy II.
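
To make the ingredients above concrete, here is a minimal Python sketch (not the authors' SMDP model) of gamma-process deterioration for unit 1 and a PHM-style hazard rate checked against two assumed control limits; all distributions, coefficients and limits are illustrative placeholders.

```python
import numpy as np

# Illustrative sketch (not the authors' SMDP model): gamma-process degradation
# of unit 1 and a PHM-style hazard rate checked against two control limits.
rng = np.random.default_rng(0)

def degrade(n_epochs, dt=1.0, shape_rate=0.3, scale=1.0):
    """Cumulative gamma-process degradation sampled at fixed inspection epochs."""
    increments = rng.gamma(shape_rate * dt, scale, size=n_epochs)
    return np.cumsum(increments)

def phm_hazard(t, z, beta0=1e-3, gamma=1.2, beta1=0.8):
    """Weibull-type baseline hazard scaled by the degradation covariate z."""
    return beta0 * gamma * t ** (gamma - 1) * np.exp(beta1 * z)

warning_limit, replacement_limit = 0.05, 0.15      # assumed control limits
for t, z in enumerate(degrade(n_epochs=20), start=1):
    h = phm_hazard(t, z)
    action = ("major repair / replacement" if h >= replacement_limit
              else "opportunistic general repair" if h >= warning_limit
              else "continue operation")
    print(f"epoch {t:2d}  degradation {z:5.2f}  hazard {h:.3f}  -> {action}")
```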

Keywords: Condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems.

396 Event Information Extraction System (EIEE): FSM vs HMM

Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani

Abstract:

Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date and time. Emails are distinct from other social text streams in terms of layout, format and conversation style, and they are the most commonly used communication channel for broadcasting and planning events. Therefore, we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, namely Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed that compares the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall and F-score. Experiments show that both methods produce high performance and accuracy; however, HMM performed better on title extraction, while FSM proved better for venue, date and time.
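
The evaluation step mentioned above is straightforward to illustrate; the short Python sketch below computes precision, recall and F-score for one extracted field, using hypothetical token sets rather than data from the study.

```python
# Evaluation sketch only: precision, recall and F-score for one extracted slot
# (e.g. the event venue). Token sets are hypothetical, not from the dataset.
def prf(predicted, gold):
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return precision, recall, f_score

print(prf(predicted=["Expo", "Centre", "Karachi"], gold=["Expo", "Centre"]))
```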

Keywords: Emails, Event Extraction, Event Detection, Finite state machines, Hidden Markov Model.

395 Diagnosis of the Abdominal Aorta Aneurysm in Magnetic Resonance Imaging Images

Authors: W. Kultangwattana, K. Somkantha, P. Phuangsuwan

Abstract:

This paper presents a technique for diagnosis of the abdominal aorta aneurysm in magnetic resonance imaging (MRI) images. First, our technique segments the aorta in MRI images. This is a required step for determining the volume of the aorta, which is important for diagnosing the abdominal aorta aneurysm. Our proposed technique detects the volume of the aorta in MRI images using a new external energy for the snakes model, calculated from Laws' texture measures. The new external energy increases the capture range of the snakes model more effectively than the conventional external energy. Second, our technique diagnoses the abdominal aorta aneurysm with a Bayesian classifier, a classification model based on statistical theory. The features for classification were derived from the aorta contour produced by our snakes model segmentation, namely area, perimeter and compactness. We also compare the proposed technique with the traditional snakes model. In our experiments, 30 images were used for training and 20 images for testing, and the results were compared with expert opinion. The experimental results show that our technique achieves an accuracy of more than 95%.
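
As an illustration of the classification stage only, the sketch below derives the compactness feature from contour measurements and feeds area, perimeter and compactness to a Gaussian naive Bayes classifier; the feature values and labels are synthetic placeholders, not the study's MRI data.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Classification-stage sketch only: contour-derived features (area, perimeter,
# compactness) fed to a Gaussian naive Bayes classifier. Values are synthetic.
def compactness(area, perimeter):
    # perimeter^2 / (4*pi*area); equals 1.0 for a perfect circle
    return perimeter ** 2 / (4.0 * np.pi * area)

# columns: area, perimeter, compactness; label 1 = aneurysm (placeholder data)
X_train = np.array([[450, 80, compactness(450, 80)],
                    [900, 130, compactness(900, 130)],
                    [420, 75, compactness(420, 75)],
                    [1100, 150, compactness(1100, 150)]])
y_train = np.array([0, 1, 0, 1])

clf = GaussianNB().fit(X_train, y_train)
print(clf.predict([[950, 140, compactness(950, 140)]]))
```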

Keywords: Abdominal Aorta Aneurysm, Bayesian Classifier, Snakes Model, Texture Feature.

394 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, we attempted to identify several heart rhythm disorders from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial neural network based on artificial immune system (AIS-ANN) and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers with respect to ANN and AIS. For this purpose, the RR-interval data for normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF) were obtained. These data were then arranged in pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF), the discrete wavelet transform was applied to the two groups in each pair, and after data reduction two data sets, with 9 and 27 features respectively, were obtained from each pair. Afterwards, the data were first randomly shuffled, and then 4-fold cross-validation was applied to create the training and testing sets. The training and testing accuracy rates and the training times were compared.

As a result, the performance of the hybrid classification systems, AIS-ANN and PSO-ANN, was found to be close to that of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN required a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. The features extracted from the data also affected the classification results significantly.
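
A minimal sketch of the feature-and-validation pipeline described above is given below; the wavelet, decomposition level, reduction rule, network size and the synthetic RR-interval segments are all assumptions, not the authors' settings.

```python
import numpy as np
import pywt
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def dwt_features(rr_segment, wavelet="db4", level=3):
    """Discrete wavelet transform followed by a simple data-reduction step."""
    coeffs = pywt.wavedec(rr_segment, wavelet, level=level)
    return np.hstack([[c.mean(), c.std()] for c in coeffs])

# synthetic stand-ins for NSR vs. APC RR-interval segments (64 samples each)
X = np.array([dwt_features(rng.normal(0.8, 0.05, 64)) for _ in range(40)] +
             [dwt_features(rng.normal(0.6, 0.15, 64)) for _ in range(40)])
y = np.array([0] * 40 + [1] * 40)

cv = KFold(n_splits=4, shuffle=True, random_state=1)
ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=1)
print(cross_val_score(ann, X, y, cv=cv).mean())
```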

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO.

393 The Effectiveness of Banks’ Web Sites: A Study of Turkish Banking Sector

Authors: Raif Parlakkaya, Huseyin Cetin, Duygu Irdiren

Abstract:

With the development of the World Wide Web, Internet usage has grown rapidly around the globe and has provided a basis for the emergence of electronic business. Like other sectors, the banking sector has adopted the Internet alongside developments in information and communication technologies. Due to the public disclosure and transparency principle of corporate governance, the importance of the information banks disclose on their web sites has increased significantly. For the purpose of this study, a Bank Disclosure Attribute Index (BDAI) for Turkey has been constructed by classifying the information disclosed on banks' web sites into general, financial, investor and corporate governance attributes. All 47 banks in the Turkish banking system have been evaluated according to the index with the aim of providing a comparison between banks. Using the chi-square test, Pearson correlation, t-test and ANOVA, it has been concluded that the majority of banks in Turkey share information on their web sites adequately with respect to their total index scores. Although there is a positive correlation between the various types of information on banks' web sites, there is no uniformity among them. Also, no significant difference between the various types of information disclosure and bank types has been observed. Compared with the total index score averages of the five largest banks in Turkey, there are some banks that need to improve the content of their web sites.
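
For illustration, the sketch below applies the named statistical tools (Pearson correlation, t-test, ANOVA) to synthetic disclosure-index scores; none of the numbers correspond to the 47 Turkish banks.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# hypothetical BDAI total scores for three bank types (synthetic placeholders)
deposit_banks = rng.normal(70, 8, 25)
participation_banks = rng.normal(66, 9, 12)
development_banks = rng.normal(64, 10, 10)

# correlation between two disclosure attribute groups across 47 banks
financial, governance = rng.normal(20, 3, 47), rng.normal(18, 4, 47)
print(stats.pearsonr(financial, governance))

# difference between two bank types, and across all three
print(stats.ttest_ind(deposit_banks, participation_banks, equal_var=False))
print(stats.f_oneway(deposit_banks, participation_banks, development_banks))
```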

Keywords: Banking sector, public disclosure, Turkey, web site evaluation.

392 Modeling and Analysis of Adaptive Buffer Sharing Scheme for Consecutive Packet Loss Reduction in Broadband Networks

Authors: Sakshi Kausha, R.K Sharma

Abstract:

High-speed networks provide real-time variable bit rate services with diversified traffic flow characteristics and quality requirements. Variable bit rate traffic has stringent delay and packet loss requirements, and the burstiness of the correlated traffic makes dynamic buffer management highly desirable for satisfying Quality of Service (QoS) requirements. This paper presents an algorithm for optimizing an adaptive buffer allocation scheme based on consecutive packet loss in the data stream and the buffer occupancy level. The buffer is designed so that the input traffic can be partitioned into different priority classes, and the threshold is controlled dynamically based on the input traffic behaviour. The algorithm allows an input packet to enter the buffer if the occupancy level is below the threshold value for that packet's priority; the threshold is varied dynamically at runtime based on packet loss behaviour. The simulation is run for two priority classes of input traffic, real-time and non-real-time. The simulation results show that Adaptive Partial Buffer Sharing (ADPBS) performs better than Static Partial Buffer Sharing (SPBS) and a First In First Out (FIFO) queue under the same traffic conditions.
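
A toy sketch of the admission rule described above follows; the buffer size, thresholds and adaptation step are illustrative assumptions, and packet service/departure is not modelled.

```python
from collections import deque

BUFFER_SIZE = 100
# partial buffer sharing: real-time packets may use the whole buffer,
# non-real-time packets only up to their (adaptive) threshold
thresholds = {"realtime": BUFFER_SIZE, "non_realtime": 70}
consecutive_losses = {"realtime": 0, "non_realtime": 0}
buffer = deque()

def admit(priority):
    if len(buffer) < thresholds[priority]:
        buffer.append(priority)
        consecutive_losses[priority] = 0
        return True
    consecutive_losses[priority] += 1
    # adaptive step: relax the threshold of a class suffering consecutive losses
    if priority == "non_realtime" and consecutive_losses[priority] > 3:
        thresholds["non_realtime"] = min(BUFFER_SIZE, thresholds["non_realtime"] + 5)
    return False

for p in ["realtime", "non_realtime"] * 60:   # toy arrival stream, no departures
    admit(p)
print(thresholds, len(buffer))
```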

Keywords: Buffer Management, Consecutive packet loss, Quality-of-Service, Priority based packet discarding, partial buffer sharing.

391 Enhanced GA-Fuzzy OPF under both Normal and Contingent Operation States

Authors: Ashish Saini, A.K. Saxena

Abstract:

Genetic algorithm (GA) based solution techniques are found suitable for optimization because of their ability to perform a simultaneous multidimensional search. Many GA variants have been tried in the past to solve optimal power flow (OPF), one of the nonlinear problems of electric power systems. Issues such as convergence speed, the accuracy of the optimal solution obtained after a number of generations, and the handling of system constraints in OPF remain subjects of discussion. The results obtained for GA-Fuzzy OPF on various power systems have shown faster convergence and lower generation costs compared to other approaches. This paper presents an enhanced GA-Fuzzy OPF (EGA-OPF) that uses penalty factors to handle line flow constraints and load bus voltage limits, for both the normal network and a contingency case with congestion. In addition to a crossover and mutation rate adaptation scheme, which adapts the crossover and mutation probabilities in each generation based on the fitness values of previous generations, a block swap operator is also incorporated into the proposed EGA-OPF. The line flow limits and load bus voltage magnitude limits are handled by incorporating line overflow and load voltage penalty factors, respectively, into each chromosome's fitness function. The effects of different penalty factor settings are also analyzed under the contingent state.
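
The penalty-factor idea can be sketched as a fitness function of the kind used in such GA formulations; the cost coefficients, limits and penalty weights below are illustrative, not the paper's settings.

```python
def fitness(pg, line_flows, load_voltages,
            a=(0.004, 0.006), b=(5.0, 6.0), c=(100.0, 120.0),
            flow_limit=1.0, v_min=0.95, v_max=1.05,
            k_flow=1e4, k_volt=1e4):
    """Quadratic generation cost plus line-flow and load-voltage penalty terms."""
    cost = sum(ai * p**2 + bi * p + ci for ai, bi, ci, p in zip(a, b, c, pg))
    flow_penalty = k_flow * sum(max(0.0, abs(f) - flow_limit)**2 for f in line_flows)
    volt_penalty = k_volt * sum(max(0.0, v_min - v)**2 + max(0.0, v - v_max)**2
                                for v in load_voltages)
    return 1.0 / (cost + flow_penalty + volt_penalty)   # GA maximizes fitness

print(fitness(pg=[120.0, 80.0], line_flows=[0.9, 1.08], load_voltages=[1.0, 0.93]))
```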

Keywords: Contingent operation state, Fuzzy rule base, Genetic Algorithms, Optimal Power Flow.

390 A Neurofuzzy Learning and its Application to Control System

Authors: Seema Chopra, R. Mitra, Vijay Kumar

Abstract:

A neurofuzzy approach for a given set of input-output training data is proposed in two phases. First, the data set is partitioned automatically into a set of clusters, and a fuzzy if-then rule is extracted from each cluster to form a fuzzy rule base. Second, a fuzzy neural network is constructed accordingly and its parameters are tuned to increase the precision of the fuzzy rule base. This network is able to learn and optimize the rule base of a Sugeno-type fuzzy inference system using a hybrid learning algorithm that combines gradient descent and the least mean squares algorithm. The proposed neurofuzzy system has the advantage of determining the number of rules automatically; it also reduces the number of rules, decreases computational time, learns faster and consumes less memory. The authors also investigate how neurofuzzy techniques can be applied in control theory to design a fuzzy controller for linear and nonlinear dynamic systems modelled from a set of input/output data. Simulation analysis is carried out on a wide range of processes, including on-line identification of nonlinear components in a control system and a benchmark problem involving the prediction of a chaotic time series. Furthermore, well-known examples of linear and nonlinear systems are simulated in the MATLAB/Simulink environment. The above combination is also illustrated by modeling the relationship between automobile trips and demographic factors.
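
The first phase (automatic determination of the rule count by clustering) can be illustrated with a simple subtractive-clustering sketch; the radii and stopping rule follow common defaults and are not necessarily the authors' parameterisation.

```python
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=0.75, eps=0.15):
    """Return cluster centres; each centre later seeds one fuzzy if-then rule."""
    X = (X - X.min(axis=0)) / (np.ptp(X, axis=0) + 1e-12)   # normalise to [0, 1]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    potential = np.exp(-4.0 * d2 / ra**2).sum(axis=1)
    centres, p_first = [], potential.max()
    while potential.max() > eps * p_first:
        k = potential.argmax()
        centres.append(X[k])
        potential -= potential[k] * np.exp(-4.0 * d2[k] / rb**2)
    return np.array(centres)

rng = np.random.default_rng(3)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
print(len(subtractive_clustering(data)), "rules (cluster centres) found")
```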

Keywords: Fuzzy control, neuro-fuzzy techniques, fuzzy subtractive clustering, extraction of rules, and optimization of membership functions.

389 Holistic Approach to Teaching Mathematics in Secondary School as a Means of Improving Students’ Comprehension of Study Material

Authors: Natalia Podkhodova, Olga Sheremeteva, Mariia Soldaeva

Abstract:

Creating favourable conditions for students' comprehension of mathematical content is one of the primary problems in teaching mathematics in secondary school. Comprehension includes the ability to build a working situational model and thus becomes an important means of solving mathematical problems. This paper describes a holistic approach to teaching mathematics designed to address the primary challenges of such teaching, specifically the challenge of students' comprehension. Essentially, this approach consists of (1) establishing links between the attributes of a notion: the sense, the meaning and the term; (2) taking into account the components of the student's subjective experience (value-based emotional, contextual, procedural and communicative) during the educational process; (3) linking together different ways of presenting mathematical information; and (4) identifying and leveraging the relationships between real, perceptual and conceptual (scientific) mathematical spaces by applying real-life situational modelling. The article describes approaches to the practical use of these foundational concepts. The primary goal was to identify how the proposed methods and techniques influence understanding of the material used in teaching mathematics. The study included an experiment in which 256 secondary school students took part: 142 in the study group and 114 in the control group. All students in these groups had similar levels of achievement in mathematics and studied under the same curriculum. In the course of the experiment, comprehension of two topics, "Derivative" and "Trigonometric functions", was evaluated. Control group participants were taught using traditional methods. Students in the study group were taught using the holistic method: under the teacher's guidance, they carried out assignments designed to establish linkages between a notion's characteristics and to convert information from one mode of presentation to another, as well as assignments that required the ability to operate with all modes of presentation. Identification, accounting for and transformation of subjective experience were supported by methods that stimulate the emotional-value component of the studied mathematical content (discussions of lesson titles, assignments aimed at creating study dominants, performing theme-related physical exercises, etc.). The use of techniques that form inter-subject notions based on linkages between real, perceptual and conceptual mathematical spaces proved to be of special interest to the students. The results of the experiment were analysed by giving students in each group a final test on each of the studied topics. The test included assignments that required building real situational models, and statistical analysis was used to aggregate the test results. Pearson's chi-square criterion (χ2) was used to assess the statistical significance of the results (pass/fail on the modelling test). A significant difference was revealed (p < 0.001), which allowed the conclusion that students in the study group showed better comprehension of mathematical information than those in the control group. The total number of assignments completed by each student was analysed as well, with average results calculated for each group. The statistical significance of the differences on the quantitative criterion (number of completed assignments) was determined using Student's t-test, which showed that students in the study group completed significantly more assignments than those in the control group (p = 0.0001). The authors thus conclude that the observed increase in comprehension of the study material resulted from the applied methods and techniques.

Keywords: Comprehension of mathematical content, holistic approach to teaching mathematics in secondary school, subjective experience, technology of the formation of inter-subject notions.

388 Maya Semantic Technique: A Mathematical Technique Used to Determine Partial Semantics for Declarative Sentences

Authors: Marcia T. Mitchell

Abstract:

This research uses computational linguistics, an area of study that employs a computer to process natural language, and aims at discerning the patterns that exist in the declarative sentences used in technical texts. The approach is mathematical, and the focus is on instructional texts found on web pages. The technique developed by the author, named the MAYA Semantic Technique, is used here and organized into four stages. In the first stage, the parts of speech in each sentence are identified. In the second stage, the subject of the sentence is determined. In the third stage, MAYA performs a frequency analysis on the remaining words to determine the verb and its object. In the fourth stage, MAYA performs a statistical analysis to determine the content of the web page. The advantage of the MAYA Semantic Technique lies in its use of mathematical principles to represent grammatical operations, which assists processing and accuracy when performed on unambiguous text. The MAYA Semantic Technique is part of a proposed architecture for an entire web-based intelligent tutoring system. On a sample set of sentences, partial semantics derived using the MAYA Semantic Technique were approximately 80% accurate. The system currently processes technical text in one domain, namely Cµ programming, in which all the keywords and programming concepts are known and understood.
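
A toy illustration of the frequency-analysis stage (stage three) is given below; the part-of-speech tags are supplied by hand as placeholders, whereas the actual MAYA technique derives them from the text in stage one and finds the subject in stage two.

```python
from collections import Counter

# POS tags are hand-supplied placeholders; the real technique derives them
# from the text itself before the frequency-analysis stage shown here.
tagged_sentences = [
    [("declare", "VERB"), ("the", "DET"), ("variable", "NOUN"), ("first", "ADV")],
    [("the", "DET"), ("compiler", "NOUN"), ("checks", "VERB"), ("the", "DET"), ("variable", "NOUN")],
    [("initialise", "VERB"), ("the", "DET"), ("variable", "NOUN")],
]

verb_counts = Counter(w for s in tagged_sentences for w, t in s if t == "VERB")
noun_counts = Counter(w for s in tagged_sentences for w, t in s if t == "NOUN")
print("most frequent verb candidate:", verb_counts.most_common(1))
print("most frequent object candidate:", noun_counts.most_common(1))
```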

Keywords: Natural language understanding, computational linguistics, knowledge representation, linguistic theories.

387 Assessment of Physicochemical Characteristics and Heavy Metals Concentration in Freshwater from Jega River, Kebbi State, Nigeria

Authors: D. Y. Bawa, M. I. Ribah, I. S. Jega, V. O. Oyedepo

Abstract:

This study was conducted to determine the physicochemical characteristics and the concentrations of heavy metals (cadmium (Cd), copper (Cu), iron (Fe), lead (Pb) and zinc (Zn)) in freshwater from the Jega River. Thirty water samples were collected in two 1-liter sterile plastic containers from three designated sampling points, namely Station A (before the bridge; upstream), Station B (at the bridge, where human activities such as washing of cars, motorbikes, clothes, bathing and other household materials are concentrated) and Station C (after the bridge; downstream), fortnightly between March and July 2014. Results indicated that the highest mean pH value of 7.08 ± 1.12 was observed at Station C; the highest conductivity, with a mean of 58.75 ± 7.87 µs/cm, was observed at Station A; the highest mean total hardness was observed at Station A (54 ± 16.11 mg/L); the highest mean nitrate level was observed at Station A (1.66 ± 1.33 mg/L); the highest mean alkalinity was observed at Station B (51.33 ± 6.66 mg/L); and the highest mean total dissolved solids (39.56 ± 3.24 mg/L) was observed at Station A. The highest mean concentration of Fe was observed at Station C (65.33 ± 4.50 mg/L); the highest concentration of Cd was observed at Station C (0.99 ± 0.36 mg/L); a mean value of 2.13 ± 1.99 mg/L was the highest concentration of Zn, observed at Station B; Pb was not detected (ND); and the highest concentration of Cu, with a mean value of 0.43 ± 0.16 mg/L, was observed at Station B, while the lowest was observed at Station C (0.27 ± 0.26 mg/L). Statistical analysis shows no significant difference (P > 0.05) among the sampling stations for either the physicochemical characteristics or the heavy metal concentrations. The results were found to be within the internationally acceptable standard limits.

Keywords: Assessment, freshwater, heavy metal concentration, physicochemical.

386 Careers-Outreach Programmes for Children: Lessons for Perceptions of Engineering and Manufacturing

Authors: Niall J. English, Sylvia Leatham, Maria Isabel Meza Silva, Denis P. Dowling

Abstract:

The training and education of undergraduate and postgraduate students can be promoted by more active learning, especially in engineering, which has documented effectiveness over more passive and vicarious experiences and approaches. However, outreach to young pupils and school-children in primary and secondary schools is a less explored area of Education and Public Engagement (EPE) efforts, particularly as regards feedback and influence on shaping third-level engineering training and education. The outreach and school-visit agenda therefore constitutes an interesting avenue to observe how active learning, careers stimulus and EPE efforts for young children and teenagers can inform the university sector, improving future engineering-teaching standards and enhancing both the quality and the capabilities of practice. This intervention involved careers-outreach efforts intended to yield statistical determinations of motivations towards engineering, manufacturing and training. The aim was to gauge to what extent the intervention would lead to increased careers awareness in engineering, using a schools-visits programme as the means for doing so. It was found that the intervention led to an increase in engagement by school pupils with engineering as a career option and a greater awareness of the importance of manufacturing.

Keywords: Outreach, education and public engagement, careers, peer interactions.

385 Modeling and Analysis of DFIG Based Wind Power System Using Instantaneous Power Components

Authors: Jaimala Gambhir, Tilak Thakur, Puneet Chawla

Abstract:

According to statistical data, the doubly-fed induction generator (DFIG) based wind turbine with variable speed and variable pitch control is the most common wind turbine in the growing wind market. This machine is usually used in grid-connected wind energy conversion systems to satisfy grid code requirements such as grid stability, fault ride through (FRT), power quality improvement, grid synchronization and power control. Since these requirements are not fulfilled directly by the machine, control strategies are applied on both the stator and rotor sides, along with power electronic converters, to fulfil them. The grid-side converter usually plays a major role in satisfying the grid code requirements of the wind turbine. Therefore, in order to improve the operating capability of the wind turbine in critical situations, an intensive study of both machine-side and grid-side converter control is necessary. In this paper, the DFIG is modeled using instantaneous power components as variables, and the performance of the DFIG system is analysed under grid voltage fluctuations. The voltage fluctuations are created by intentionally lowering and raising the voltage in the utility grid in the simulation, keeping in view different grid disturbances.

Keywords: DFIG, dynamic modeling, DPC, sag, swell, voltage fluctuations, FRT.

384 Large Scale Production of Polyhydroxyalkanoates (PHAs) from Wastewater: A Study of Techno-Economics, Energy Use and Greenhouse Gas Emissions

Authors: Cora Fernandez Dacosta, John A. Posada, Andrea Ramirez

Abstract:

The biodegradable polymer family of polyhydroxyalkanoates is an interesting substitute for conventional fossil-based plastics. However, the manufacturing and environmental impacts associated with their production via intracellular bacterial fermentation are strongly dependent on the raw material used and on the energy consumption during the extraction process, limiting their potential for commercialization. Industrial wastewater is studied in this paper as a promising alternative feedstock for waste valorization. Based on results from laboratory and pilot-scale experiments, a conceptual process design, a techno-economic analysis and a life cycle assessment are developed for the large-scale production of the most common type of polyhydroxyalkanoate, polyhydroxybutyrate. Intracellular polyhydroxybutyrate is obtained via fermentation of the microbial community present in industrial wastewater, and the downstream processing is based on chemical digestion with surfactant and hypochlorite. The economic potential and environmental performance results help identify bottlenecks and the best opportunities to scale up the process prior to industrial implementation. The outcome of this research indicates that the fermentation of wastewater towards PHB presents advantages over traditional PHA production from sugars because of the null environmental burdens and financial costs of the raw material in the bioplastic production process. Nevertheless, process optimization is still required to compete with the petrochemical counterparts.

Keywords: Circular economy, life cycle assessment, polyhydroxyalkanoates, waste valorization.

383 Preparation of Corn Flour Based Extruded Product and Evaluate Its Physical Characteristics

Authors: C. S. Saini

Abstract:

A composite flour blend consisting of corn, pearl millet, black gram and wheat bran in the ratio 80:5:10:5 was used to prepare the extruded product, and the effect of processing conditions on the physical properties of the extrudate was studied. The extrusion was carried out in the laboratory using a twin screw extruder. The physical characteristics evaluated include lateral expansion, bulk density, water absorption index, water solubility index, rehydration ratio and moisture retention. A Central Composite Rotatable Design (CCRD) was used to decide the levels of the processing variables, i.e. feed moisture content (%), screw speed (rpm) and barrel temperature (°C), for the experiment. The data obtained after the extrusion process were analyzed using response surface methodology, and a second-order polynomial model for the dependent variables was fitted to the experimental data. The numerical optimization studies resulted in a barrel temperature of 127 °C, a screw speed of 246 rpm and a feed moisture of 14.5% as the optimum variables to produce an acceptable extruded product. The responses predicted by the software for the optimum process conditions were lateral expansion 126%, bulk density 0.28 g/cm3, water absorption index 4.10 g/g, water solubility index 39.90%, rehydration ratio 544% and moisture retention 11.90%, with 75% desirability.
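
The response-surface step can be sketched as an ordinary least-squares fit of a second-order polynomial in the three processing variables; the design points and responses below are synthetic placeholders, not the CCRD data of the study.

```python
import numpy as np

rng = np.random.default_rng(4)
# synthetic design points: feed moisture (%), screw speed (rpm), barrel temp (degC)
X = rng.uniform([12, 200, 110], [17, 300, 140], size=(20, 3))
y = 100 + 2*X[:, 0] - 0.1*X[:, 1] + 0.5*X[:, 2] + rng.normal(0, 2, 20)  # e.g. expansion %

def quadratic_design(X):
    """Columns for y = b0 + sum(bi xi) + sum(bii xi^2) + sum(bij xi xj)."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

coeffs, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
print(np.round(coeffs, 4))
```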

Keywords: Black gram, corn flour, extrusion, physical characteristics.

382 Economical and Technical Analysis of Urban Transit System Selection Using TOPSIS Method According to Constructional and Operational Aspects

Authors: Ali Abdi Kordani, Meysam Rooyintan, Sid Mohammad Boroomandrad

Abstract:

Nowadays, one of the most important problems in megacities is public transportation and satisfying citizens with this system in order to decrease traffic congestion and air pollution. Accordingly, to improve passenger transit and increase travel safety, new transportation systems such as Bus Rapid Transit (BRT), tram and monorail have expanded, each with different merits and demerits. That is why comparing different systems for a systematic selection of public transportation systems is essential in a big city like Tehran, which has numerous problems in terms of traffic and pollution. This paper investigates the advantages and feasibility of using the monorail, tram and BRT systems, which are widely used in megacities all over the world. For Tehran, these three modes are compared using SPSS statistical analysis software and the TOPSIS method, and the results are assessed. Experts experienced in the transportation field answered the prepared matrix questionnaire used to evaluate each public transportation mode (tram, monorail and BRT). According to the experts' judgments, monorail has the first priority, tram the second and BRT the third with respect to the considered indices, such as execution costs, time losses, depreciation, pollution, operation costs, travel time, passenger satisfaction, benefit-to-cost ratio and traffic congestion.
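
A minimal sketch of the TOPSIS ranking computation is shown below; the decision matrix, weights and cost/benefit designations are illustrative placeholders rather than the experts' questionnaire scores.

```python
import numpy as np

alternatives = ["Monorail", "Tram", "BRT"]
# columns: execution cost, travel time, passenger satisfaction, pollution
D = np.array([[7.0, 5.0, 9.0, 2.0],
              [5.0, 7.0, 7.0, 4.0],
              [3.0, 8.0, 6.0, 7.0]])
weights = np.array([0.3, 0.25, 0.25, 0.2])
benefit = np.array([False, False, True, False])     # True = larger is better

V = weights * D / np.linalg.norm(D, axis=0)          # weighted normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_neg / (d_pos + d_neg)
for name, score in sorted(zip(alternatives, closeness), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```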

Keywords: Bus Rapid Transit, Costs, Monorail, Pollution, Tram.

381 The Relation between the Organizational Trust Level and Organizational Justice Perceptions of Staff in Konya Municipality: A Theoretical and Empirical Study

Authors: Handan Ertaş

Abstract:

The aim of the study is to determine the relationship between the organizational trust level and the organizational justice perceptions of municipality officials. A correlational method was used within a descriptive survey model, and the Organizational Justice Perception Scale, the Organizational Trust Inventory and the Interpersonal Trust Scale were applied to 353 participants working in Konya Metropolitan Municipality and the central district municipalities. Frequencies, independent-samples t-tests for binary groups, one-way ANOVA for multiple groups and Pearson correlation analysis were used in the data analysis to determine the relationships. The outcomes of the study show that participants have a high level of organizational trust, that "interpersonal trust" ranks first, and that there is a significant difference in favour of male officials in terms of trust in the organization itself and interpersonal trust. It was also found that officials in district municipalities have higher perception levels in all dimensions, that there is a significant difference in the trust-in-the-organization sub-dimension, and that work status is an important factor in the perception of organizational trust. Moreover, the study has shown that organizational justice practices are important in raising officials' trust in the organization, administrators and colleagues, and that there is a parallel relation between organizational trust components and organizational trust dimensions.

Keywords: Konya, Organizational Justice, Organizational Trust.

380 Durian Marker Kit for Durian (Durio zibethinus Murr.) Identity

Authors: Emma K. Sales

Abstract:

Durian is the flagship fruit of Mindanao, and there is an abundance of cultivars with many confusing identities and names. The project was conducted to develop a procedure for reliable and rapid detection and sorting of durian planting materials. Moreover, it also aimed to establish specific genetic or DNA markers for routine testing and authentication of durian cultivars in question. The project developed molecular procedures for routine testing, and SSR primers were screened and identified for their utility in discriminating the durian cultivars collected. The study produced the following accomplishments: (1) twenty-nine (29) SSR primers were selected and identified based on their ability to discriminate durian cultivars; (2) a standard procedure for identification and authentication of durian cultivars was optimized and established; and (3) a genetic profile of durian is now available at the Biotech Unit. Our results demonstrate the relevance of using molecular techniques in evaluating and identifying durian clones. The most polymorphic primers tested in this study could be useful tools for detecting variation even at an early stage of the plant, especially for commercial purposes. The process developed combines the efficiency of the microsatellite development process with the optimization of a non-radioactive detection process, resulting in a user-friendly protocol that can be performed in two (2) weeks and easily incorporated into laboratories about to start microsatellite development projects. This can be of great importance for extending microsatellite analyses to other crop species for which minimal genetic information is currently available. With this, the University can now serve as a service laboratory for routine testing and authentication of durian clones.

Keywords: DNA, SSR Analysis, genotype, genetic diversity, cultivars.

379 Evaluation of Green Roof System for Green Building Projects in Malaysia

Authors: Muhammad Ashraf Fauzi, Nurhayati Abdul Malek, Jamilah Othman

Abstract:

Green roofs have been widely implemented in developed countries such as Germany, the United Kingdom, the United States and Canada. Green roofs have many benefits, such as aesthetic and economic value and ecological gains, which include optimization of stormwater management, urban heat island mitigation and energy conservation. In terms of pollution, green roofs can control air and noise pollution in urban areas. The application of green roofs in Malaysian buildings has been studied by reviewing previous work on green roofs in Malaysia and in other Asian regions such as Indonesia, Singapore, Thailand, Taiwan and several other countries with climates and environments similar to Malaysia's. These green roof technologies have been compared against the Green Building Index (GBI) for Malaysian buildings. The study concentrated on the technical aspects of green roof systems, focusing on (i) waste and recyclable materials, (ii) types of plants and methods of planting, and (iii) the green roof as a tool to reduce stormwater runoff. The findings in these areas are compared with their suitability for achieving good GBI practice in Malaysia. Results show that most methods are based on each country's own climate and environment. This suggests that green roof methods for Malaysia must suit its tropical climate. The suggestions of this research are framed in terms of green roof sustainability. Further research can be developed to implement the best methods and applications in the Malaysian climate, especially in urban cities and townships.

Keywords: Green roofs, vegetation, plants, material, stormwater.

378 The Investigation of Enzymatic Activity in the Soils under the Impact of Metallurgical Industrial Activity in Lori Marz, Armenia

Authors: T. H. Derdzyan, K. A. Ghazaryan, G. A. Gevorgyan

Abstract:

Beta-glucosidase, chitinase, leucine-aminopeptidase, acid phosphomonoesterase and acetate-esterase enzyme activities in the soils under the impact of metallurgical industrial activity in Lori marz (district) were investigated. The results of the study showed that the activities of the investigated enzymes in the soils decreased with increasing distance from the Shamlugh copper mine, the Chochkan tailings storage facility and the ore transportation road. Statistical analysis revealed that the activities of the enzymes were significantly positively correlated with each other across the observation sites, which indicated that the enzyme activities were affected by the same anthropogenic factor. The investigations showed that the soils were polluted with heavy metals (Cu, Pb, As, Co, Ni, Zn) due to the copper mining activity in this territory. The results of Pearson correlation analysis revealed a significant negative correlation between the degree of heavy metal pollution (Nemerow integrated pollution index) and soil enzyme activity. All of this indicated that the copper mining activity in this territory, causing the heavy metal pollution of the soils, resulted in the inhibition of the activities of the enzymes, which are considered biological catalysts that decompose organic materials and facilitate the cycling of nutrients.
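
For reference, the pollution-index step can be sketched as follows, using the commonly cited form of the Nemerow integrated pollution index, PN = sqrt((Pmean^2 + Pmax^2)/2), with single-factor indices Pi = Ci/Si; the concentrations and reference values below are placeholders, not the Lori marz measurements.

```python
import numpy as np

# single-factor indices Pi = Ci / Si and Nemerow index PN = sqrt((Pmean^2 + Pmax^2)/2)
concentrations = {"Cu": 120.0, "Pb": 45.0, "Zn": 180.0, "Ni": 30.0}   # mg/kg, placeholders
background = {"Cu": 35.0, "Pb": 20.0, "Zn": 90.0, "Ni": 40.0}          # mg/kg, placeholders

P = np.array([concentrations[m] / background[m] for m in concentrations])
nemerow = np.sqrt((P.mean() ** 2 + P.max() ** 2) / 2.0)
print(round(float(nemerow), 2))
```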

Keywords: Armenia, metallurgical industrial activity, heavy metal pollution, soil enzyme activity.

377 Absorption of Volatile Organic Compounds into Polydimethylsiloxane: Phase Equilibrium Computation at Infinite Dilution

Authors: Edison Muzenda, Corina M Mateescu

Abstract:

Group contribution methods such as UNIFAC are very useful to researchers and engineers involved in synthesis, feasibility studies, design and optimization of separation processes. They can be applied successfully to predict phase equilibrium and excess properties in the development of chemical and separation processes. The main focus of this work was to investigate the possibility of absorbing selected volatile organic compounds (VOCs) into polydimethylsiloxane (PDMS) using three selected UNIFAC group contribution methods. Absorption followed by subsequent stripping is the predominant available abatement technology for VOCs in flue gases prior to their release into the atmosphere. The original, modified and effective UNIFAC models were used in this work. The thirteen VOCs considered in this research are: pentane, hexane, heptane, trimethylamine, toluene, xylene, cyclohexane, butyl acetate, diethyl acetate, chloroform, acetone, ethyl methyl ketone and isobutyl methyl ketone. The computation was done for a solute VOC concentration of 8.55×10-8, which is well within the infinite dilution region. The results obtained in this study compare very well with those published in the literature, obtained through both measurements and predictions. The phase equilibria obtained in this study show that PDMS is a good absorbent for the removal of VOCs from contaminated air streams through physical absorption.

Keywords: Absorption, Computation, Feasibility studies, Infinite dilution, Volatile organic compounds

376 Alternative Methods to Rank the Impact of Object Oriented Metrics in Fault Prediction Modeling using Neural Networks

Authors: Kamaldeep Kaur, Arvinder Kaur, Ruchika Malhotra

Abstract:

The aim of this paper is to rank the impact of object-oriented (OO) metrics in fault prediction modeling using artificial neural networks (ANNs). Past studies on the empirical validation of object-oriented metrics as fault predictors using ANNs have focused on the predictive quality of neural networks versus standard statistical techniques. In this empirical study, we turn our attention to the capability of ANNs to rank the impact of these explanatory metrics on fault proneness. In the ANN data analysis approach, there is no single established method of ranking the impact of individual metrics. Five ANN-based techniques that rank object-oriented metrics in predicting the fault proneness of classes are studied: (i) the overall connection weights method, (ii) Garson's method, (iii) the partial derivatives method, (iv) the input perturbation method and (v) the classical stepwise method. We develop and evaluate different prediction models based on the rankings of the metrics produced by the individual techniques. The models based on the overall connection weights and partial derivatives methods were found to be the most accurate.
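
Two of the ranking techniques, Garson's method and the overall connection weights method, can be sketched directly from a trained network's weight matrices; the weights below are random stand-ins, not weights from the study's models.

```python
import numpy as np

rng = np.random.default_rng(5)
metrics = ["WMC", "DIT", "NOC", "CBO", "RFC", "LCOM"]   # example OO metrics
W = rng.normal(size=(len(metrics), 4))   # input-to-hidden weights (stand-ins)
v = rng.normal(size=4)                   # hidden-to-output weights (stand-ins)

# Garson's method: absolute contributions, normalised per hidden unit
C = np.abs(W) * np.abs(v)
R = C / C.sum(axis=0)
garson = R.sum(axis=1) / R.sum()

# Overall connection weights: signed products summed over hidden units
overall = (W * v).sum(axis=1)

for m, g, o in zip(metrics, garson, overall):
    print(f"{m}: Garson = {g:.3f}, connection weight = {o:+.3f}")
```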

Keywords: Artificial Neural Networks (ANNs), Backpropagation, Fault Prediction Modeling.

375 A Practice of Zero Trust Architecture in Financial Transactions

Authors: L. Wang, Y. Chen, T. Wu, S. Hu

Abstract:

In order to enhance the security of critical financial infrastructure, this study carries out a transformation of the architecture of a financial trading terminal to a zero trust architecture (ZTA), constructs an active defense system for cybersecurity, improves the security level of trading services in the Internet environment, enhances the ability to prevent network attacks and unknown risks, and reduces the industry and security risks brought about by cybersecurity threats. This study introduces the Software Defined Perimeter (SDP) technology of ZTA and adapts and applies it to a financial trading terminal to achieve security optimization and fine-grained business grading control. The upgraded architecture of the trading terminal moves security protection forward to the user access layer, replaces the VPN to optimize remote access and significantly improves the security protection of Internet transactions. The study achieves: (1) deep integration with the access control architecture of the transaction system; (2) no impact on the performance of terminals and gateways, and no perception of application system upgrades; (3) customized checklist and policy configuration; and (4) the introduction of industry-leading security technologies such as single-packet authorization (SPA) and secondary authentication. This study carries out a successful application of ZTA in the field of financial trading and provides transformation ideas for other similar systems while improving the security level of financial transaction services in the Internet environment.

Keywords: Zero trust, trading terminal, architecture, network security, cybersecurity.

374 Studies of Rule Induction by STRIM from the Decision Table with Contaminated Attribute Values from Missing Data and Noise — In the Case of Critical Dataset Size —

Authors: Tetsuro Saeki, Yuichi Kato, Shoutarou Mizuno

Abstract:

STRIM (Statistical Test Rule Induction Method) has been proposed as a method to effectively induce if-then rules from a decision table, which is considered a sample set obtained from the population of interest. Its usefulness has been confirmed by simulation experiments in which rules were specified in advance, and by comparison with conventional methods. However, scope for further development remains before STRIM can be applied to the analysis of real-world data sets. The first requirement is to determine the dataset size needed for inducing true rules, since finding statistically significant rules is the core of the method. The second is to examine the capacity for rule induction from datasets with attribute values contaminated by missing data and noise, since real-world datasets usually contain such contaminated data. This paper examines the first problem theoretically, in connection with the rule length. The second problem is then examined in a simulation experiment, utilizing the critical dataset size derived from the first step. The experimental results show that STRIM is highly robust in the analysis of datasets with contaminated attribute values, and hence is applicable to real-world data.

Keywords: Rule induction, decision table, missing data, noise.

373 Formulation and Technology of the Composition of Essential Oils as a Feed Additive in Poultry with Antibacterial Action

Authors: S. Barbaqadze, M. Goderdzishvili, E. Mosidze, L. Lomtadze, V. Mshvildadze, L. Bakuridze, D. Berashvili, A. Bakuridze

Abstract:

This paper focuses on the formulation of a phytobiotic designed for further implementation in poultry farming. The composition was intended to be a water-soluble powder containing antibacterial essential oils. The development process involved thyme, monarda and clary sage essential oils. The antimicrobial activity of the essential oil composition was to be tested against gram-negative and gram-positive bacterial strains. The results were processed using the statistical program Sigma STAT. To make the essential oil composition water-soluble, surfactants were added. At the first stage of the study, nine options for the optimal composition of essential oils and surfactants were developed, and the effect of the amount of surfactant on the water solubility of the essential oil composition was investigated. On the basis of biopharmaceutical studies, the formulation of the phytobiotic was determined: thyme, monarda and clary sage essential oils (2:1:1), 100 parts; licorice extract, 5.25 parts; and inhalation lactose, 300 parts. A technology for the preparation of the phytobiotic has been developed and a technological scheme for its preparation has been drawn up. The research was performed within the framework of the grant project CARYS-19-363, funded by the Shota Rustaveli National Science Foundation of Georgia.

Keywords: Clary, essential oils, monarda, phytobiotics, poultry, thyme.

372 Featured based Segmentation of Color Textured Images using GLCM and Markov Random Field Model

Authors: Dipti Patra, Mridula J

Abstract:

In this paper, we propose a new image segmentation approach for colour textured images. The proposed method consists of two stages. In the first stage, textural features are computed using the gray level co-occurrence matrix (GLCM) for regions of interest (ROI) considered for each class; the ROIs act as ground truth for the classes. The Ohta model (I1, I2, I3) is the colour model used for segmentation. The statistical mean feature of the I2 component at a certain inter-pixel distance (IPD) was considered the optimal textural feature for further segmentation. In the second stage, the feature matrix obtained is treated as a degraded version of the image labels, and the unknown image labels are modeled as a Markov random field (MRF). The labels are estimated with the maximum a posteriori (MAP) criterion using the iterated conditional modes (ICM) algorithm. The performance of the proposed approach is compared with that of existing schemes, JSEG and another scheme that uses GLCM and MRF in RGB colour space. The proposed method is found to outperform the existing ones in terms of segmentation accuracy with an acceptable rate of convergence. The results are validated on synthetic and real textured images.
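
A sketch of the first-stage feature only is given below: a GLCM computed at a horizontal inter-pixel distance for one channel (e.g. the Ohta I2 component) and its mean; the image is a synthetic placeholder and the quantisation choice is an assumption.

```python
import numpy as np

def glcm_mean(channel, d=1, levels=16):
    """GLCM at horizontal inter-pixel distance d, reduced to its mean feature."""
    q = np.floor(channel / channel.max() * (levels - 1)).astype(int)   # quantise
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-d].ravel(), q[:, d:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()                        # joint probabilities p(i, j)
    rows = np.arange(levels)[:, None]
    return float((rows * glcm).sum())         # sum_i sum_j i * p(i, j)

rng = np.random.default_rng(6)
i2_component = rng.integers(0, 255, size=(64, 64)).astype(float)   # placeholder channel
print(round(glcm_mean(i2_component, d=2), 3))
```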

Keywords: Texture Image Segmentation, Gray Level Co-occurrence Matrix, Markov Random Field Model, Ohta colour space, ICM algorithm.

371 Assessment of Health and Safety Item on Construction Sites in Ondo State

Authors: Ikumapayi Catherine Mayowa

Abstract:

The well-being of human beings on construction sites is very important; much manpower has been lost through accidents which kill workers or make them physically unfit to carry out construction activities, and these losses in turn have multiple effects on the whole economy. Thus, it is necessary to put all safety items and regulations in place before construction activities commence. This study was carried out in Ondo State of Nigeria to assess and analyse the state of health and safety of construction workers in the state. The study used a first-hand observation method: 50 construction project sites were visited in 10 major towns of Ondo State, questionnaires were distributed, and the results were analysed. The results show that construction workers are exposed to many construction site hazards due to inadequate safety programmes and the non-provision of appropriate safety materials for workers on site. From the data obtained for each site visited and the statistical analysis, it can be concluded that the occurrence of accidents on construction sites depends significantly on the safety facilities available on the sites. The regression statistics show that the significance level for the dependence of accident occurrence on the availability of safety items on site is 0.0362, which is less than the required maximum significance level of 0.05. Therefore, a vital way of sustaining the building strategy is to give detailed attention to the provision of adequate health and safety items on construction sites, which will reduce the occurrence of accidents, the loss of manpower and the deaths of skilled workers, among other effects.

Keywords: Construction sites, health, safety, welfare.

370 A Nodal Transmission Pricing Model based on Newly Developed Expressions of Real and Reactive Power Marginal Prices in Competitive Electricity Markets

Authors: Ashish Saini, A.K. Saxena

Abstract:

In competitive electricity markets all over the world, the adoption of a suitable transmission pricing model is a problem, as the transmission segment still operates as a monopoly. Transmission pricing is an important tool to promote investment in various transmission services in order to provide economic, secure and reliable electricity to bulk and retail customers. Nodal pricing based on the short-run marginal cost (SRMC) has been found extremely useful by researchers for sending correct economic signals. The marginal prices must be determined as part of the solution to an optimization problem, i.e. maximization of the social welfare. The need to maximize social welfare subject to a number of system operational constraints is a major challenge from both computational and societal points of view. The purpose of this paper is to present a nodal transmission pricing model based on SRMC by developing new mathematical expressions for real and reactive power marginal prices using a GA-Fuzzy based optimal power flow framework. The impacts of selecting different social welfare functions on power marginal prices are analyzed and verified against results reported in the literature. Network revenues for two different power systems are determined using the expressions derived for real and reactive power marginal prices in this paper.

Keywords: Deregulation, electricity markets, nodal pricing, social welfare function, short run marginal cost.

369 Optimization the Conditions of Electrophoretic Deposition Fabrication of Graphene-Based Electrode to Consider Applications in Electro-Optical Sensors

Authors: Sepehr Lajevardi Esfahani, Shohre Rouhani, Zahra Ranjbar

Abstract:

Graphene has gained much attention owing to its unique optical and electrical properties. Charge carriers in graphene sheets (GS) obey a linear dispersion relation near the Fermi energy and behave as massless Dirac fermions, resulting in unusual attributes such as the quantum Hall effect and the ambipolar electric field effect. Graphene also exhibits nondispersive transport characteristics with an extremely high electron mobility (15000 cm2/(V·s)) at room temperature. Recently, considerable progress has been achieved in the fabrication of single- or multilayer GS for functional device applications in the field of optoelectronics, such as field-effect transistors, ultrasensitive sensors and organic photovoltaic cells. In addition to device applications, graphene can also serve as a reinforcement to enhance the mechanical, thermal or electrical properties of composite materials. Electrophoretic deposition (EPD) is an attractive method for the development of various coatings and films; it is readily applied to any powdered solid that forms a stable suspension. The deposition parameters were controlled to obtain various thicknesses. In this study, the graphene electrodeposition conditions were optimized. The results were obtained from SEM, ohmic resistance measurements and AFM characterization. The minimum sheet resistance of the electrodeposited reduced graphene oxide layers is achieved at 2 V for 10 s, followed by annealing at 200 °C for 1 minute.

Keywords: Electrophoretic deposition, graphene oxide, electrical conductivity, electro-optical devices.

368 Modelling Dengue Fever (DF) and Dengue Haemorrhagic Fever (DHF) Outbreak Using Poisson and Negative Binomial Model

Authors: W. Y. Wan Fairos, W. H. Wan Azaki, L. Mohamad Alias, Y. Bee Wah

Abstract:

Dengue fever has become a major concern for health authorities all over the world, particularly in tropical countries. These countries, in particular, are experiencing the most worrying outbreaks of dengue fever (DF) and dengue haemorrhagic fever (DHF). The DF and DHF epidemics have thus become the main causes of hospital admissions and deaths in Malaysia. This paper therefore attempts to examine the environmental factors that may influence the recent dengue outbreak. The aim of this study is twofold: firstly, to establish a statistical model describing the relationship between the number of dengue cases and a range of explanatory variables and, secondly, to identify the lag at which the explanatory variables affect dengue incidence the most. The explanatory variables involved include the level of cloud cover, percentage of relative humidity, amount of rainfall, maximum temperature, minimum temperature and wind speed. Poisson and negative binomial regression analyses were used in this study. The results of the analyses of the 915 observations (daily data from July 2006 to December 2008) reveal that the climatic factors comprising daily temperature and wind speed significantly influence the incidence of dengue fever after 2 and 3 weeks of their occurrence. The effect of humidity, on the other hand, appears to be significant only after 2 weeks.
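
The modelling step can be sketched with standard GLM fits of Poisson and negative binomial regressions on lagged climate covariates; the data frame below is synthetic and only mirrors the variable layout described, and the lag choices are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 915
df = pd.DataFrame({
    "cases": rng.poisson(5, n),                 # daily dengue counts (synthetic)
    "max_temp": rng.normal(32, 2, n),
    "wind_speed": rng.normal(8, 2, n),
    "humidity": rng.normal(80, 5, n),
})
# lagged covariates, e.g. 2 and 3 weeks before the case count
df["max_temp_lag14"] = df["max_temp"].shift(14)
df["wind_lag21"] = df["wind_speed"].shift(21)
df["humidity_lag14"] = df["humidity"].shift(14)
df = df.dropna()

X = sm.add_constant(df[["max_temp_lag14", "wind_lag21", "humidity_lag14"]])
poisson_fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial()).fit()
print(poisson_fit.aic, negbin_fit.aic)
```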

Keywords: Dengue Fever, Dengue Hemorrhagic Fever, Negative Binomial Regression model, Poisson Regression model.
