Search results for: pinch point analysis.

8541 The Correlation between Peer Aggression and Peer Victimization: Are Aggressors Victims Too?

Authors: Glenn M. Calaguas

Abstract:

To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of reporting aggressive behaviors or of being victimized during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. The Pearson Product-Moment Correlation Coefficient (PMCC) was used to determine the correlations between the scores of the sixth-graders on the two scales, both for individual items and for total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
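
As a brief illustration of the statistical step described above (not the study's own script), the following sketch computes the Pearson Product-Moment Correlation Coefficient with SciPy; the score arrays are hypothetical placeholders rather than actual RAVS data.

```python
# Pearson's r between aggression and victimization scores (toy data).
from scipy.stats import pearsonr

aggression_scores    = [2, 0, 1, 4, 3, 0, 2, 5, 1, 3]   # total aggression points per pupil (hypothetical)
victimization_scores = [1, 0, 2, 3, 4, 1, 2, 4, 0, 3]   # total victimization points per pupil (hypothetical)

r, p_value = pearsonr(aggression_scores, victimization_scores)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")  # significant at the 0.01 level if p < 0.01
```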

Keywords: correlation, peer aggression, peer victimization, sixth-graders.

8540 Enhanced Multi-Intensity Analysis in Multi-Scenery Classification-Based Macro and Micro Elements

Authors: R. Bremananth

Abstract:

Several computationally challenging issues are encountered while classifying complex natural scenes. In this paper, we address the problems encountered in rotation-invariant, multi-intensity analysis of overlapping multi-scene imagery. Various algorithms in the existing literature have been proposed for multi-intensity analysis, but they impose several restrictions when deployed for overlapping multi-scene classification. To resolve this problem, we present a framework based on macro and micro basis functions. The algorithm minimizes the classification false alarm rate when categorizing overlapping multi-scene imagery. Furthermore, a quadrangle multi-intensity decay is invoked. Several parameters are used to analyze invariance for multi-scene classification, such as rotation, classification, correlation, contrast, homogeneity, and energy. Benchmark datasets of complex natural scenes were collected and used to evaluate the framework. The results show that the framework achieves a significant improvement in gray-level co-occurrence matrix features for overlapping scenes across diverse orientations.
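
For orientation, the sketch below computes the texture measures named above (contrast, homogeneity, energy, correlation) from a gray-level co-occurrence matrix. It is a generic GLCM illustration on a random image, not the paper's macro/micro basis-function framework.

```python
import numpy as np

def glcm_features(img, levels=8, dx=1, dy=0):
    """Build a normalized GLCM for offset (dx, dy) and derive Haralick-style features."""
    img = (img.astype(float) / img.max() * (levels - 1)).astype(int)  # quantize gray levels
    glcm = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[img[y, x], img[y + dy, x + dx]] += 1
    p = glcm / glcm.sum()                                   # normalize to joint probabilities
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return {
        "contrast":    ((i - j) ** 2 * p).sum(),
        "homogeneity": (p / (1.0 + (i - j) ** 2)).sum(),
        "energy":      (p ** 2).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j),
    }

rng = np.random.default_rng(0)
print(glcm_features(rng.integers(0, 256, size=(64, 64))))  # random image as a stand-in for a scene
```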

Keywords: Automatic classification, contrast, homogeneity, invariant analysis, multi-scene analysis, overlapping.

8539 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality relies heavily on location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link them to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status of the selected levels of spatial units is used for decision making. The partition of the spatial units therefore has a dominant influence on the analyzed results, a phenomenon well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that seek homogeneity in population and household counts. Compared to the outcomes of the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and the township unit on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1,757 per 100,000 persons, whereas the second dissemination area (TGSC) shows greater variation, ranging from 0 to 2,222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referenced to the TGSC in this research is strongly supported by Geographic Information System (GIS) technology. An integrated workflow is developed that consists of processing death certificates, geocoding street addresses, quality assurance of the geocoded results, automatic calculation of statistical measures, standardized encoding of the measures, and geo-visualization of the statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of the mortality data and justify the analyzed results. With a common statistical area framework such as TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to help avoid wrong decision making.
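
As a small illustration of the age-standardized death rate (ASDR) quoted above, the sketch below performs a direct standardization; the age groups, death counts, populations, and standard weights are hypothetical placeholders, not the Taitung County data.

```python
import numpy as np

deaths      = np.array([2, 5, 20, 80])              # deaths per age group in one statistical area (hypothetical)
population  = np.array([1200, 1500, 900, 400])      # resident population per age group (hypothetical)
std_weights = np.array([0.35, 0.35, 0.20, 0.10])    # standard population age structure (sums to 1)

age_specific_rates = deaths / population            # crude rate within each age group
asdr = (age_specific_rates * std_weights).sum() * 100_000
print(f"ASDR = {asdr:.0f} per 100,000 persons")
```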

Keywords: Mortality map, spatial patterns, statistical area, variation.

8538 High Capacity Data Hiding based on Predictor and Histogram Modification

Authors: Hui-Yu Huang, Shih-Hsu Chang

Abstract:

In this paper, we propose a high-capacity image hiding technique based on pixel prediction and modification of the difference histogram. The approach uses pixel prediction and the modified difference histogram to determine the best embedding point. It improves predictive accuracy and enlarges the pixel differences to increase the hiding capacity. Histogram modification is also used to prevent overflow and underflow. Experimental results demonstrate that, at the same average hiding capacity, the proposed method maintains high image quality and low distortion.
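
The sketch below illustrates the general peak-point/histogram-shifting idea underlying such methods: payload bits are hidden in the most frequent bin of a prediction-error histogram and larger errors are shifted by one to keep extraction unambiguous. It is a generic, simplified illustration with toy values; it does not reproduce the paper's specific predictor or its overflow/underflow handling.

```python
import numpy as np

def embed(errors, bits):
    peak = np.bincount(errors - errors.min()).argmax() + errors.min()  # most frequent error value
    stego, k = [], 0
    for e in errors:
        if e > peak:
            stego.append(e + 1)                 # shift to open a gap next to the peak bin
        elif e == peak and k < len(bits):
            stego.append(e + bits[k]); k += 1   # the peak value carries one payload bit
        else:
            stego.append(e)
    return np.array(stego), peak

def extract(stego, peak, n_bits):
    return [s - peak for s in stego if s in (peak, peak + 1)][:n_bits]

errors = np.array([0, 1, 0, -1, 0, 2, 0, 1, 0, -2, 0, 3])  # toy prediction errors
payload = [1, 0, 1, 1, 0]
stego, peak = embed(errors, payload)
print(extract(stego, peak, len(payload)) == payload)        # True
```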

Keywords: data hiding, predictor

8537 PIL Theory

Authors: A. Peveri

Abstract:

Space-time is curved by the presence of matter, and this deformation must follow a pattern rather than being random. Space is uniform and elastic, and any modification that occurs in one part causes a change in another.

This deformation must have a constant value that is independent of the observer, and it relates the amount of matter, the force caused by the curvature of space, and the surface of space. This unit of space is defined in this study as the PIL and represents a constant area of space, deformable in the direction and sense of the center of mass of the body. The PIL is curved and connected to the center of mass of the Earth, reaching that point through all matter, and thus forms part of the space between particles at the atomic and subatomic levels. At these levels the space between particles is flat, unlike the macroscopic scale, where space curves.

Keywords: Flat space, curved space, unit of space, deformation.

8536 Kinetic Parameter Estimation from Thermogravimetry and Microscale Combustion Calorimetry

Authors: Rhoda Afriyie Mensah, Lin Jiang, Solomon Asante-Okyere, Xu Qiang, Cong Jin

Abstract:

Flammability analysis of extruded polystyrene (XPS) has become crucial due to its use as an insulation material in energy-efficient buildings. Using the Kissinger-Akahira-Sunose and Flynn-Wall-Ozawa methods, the degradation kinetics of two pure XPS samples from the local market, one red and one grey, were obtained from the results of thermogravimetric analysis (TG) and microscale combustion calorimetry (MCC) experiments performed at the same heating rates. The experiments showed that the red XPS released more heat than the grey XPS and that both materials exhibited two mass-loss stages. Consequently, the kinetic parameters for the red XPS were higher than those for the grey XPS. A comparative evaluation of the activation energies from MCC and TG showed an insignificant degree of deviation, signifying equivalent apparent activation energies from both methods. However, when the dependence of the activation energy on the extent of conversion was compared for TG and MCC, different activation energy profiles appeared as a result of the different chemical pathways.
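
The sketch below shows the core regression step of the Flynn-Wall-Ozawa isoconversional method mentioned above: at a fixed conversion, ln(heating rate) is regressed against 1/T and the apparent activation energy follows from the slope via the Doyle approximation. The heating rates and temperatures are hypothetical, not the XPS measurements.

```python
import numpy as np

R = 8.314                                          # gas constant, J mol^-1 K^-1
betas = np.array([5.0, 10.0, 20.0, 40.0])          # heating rates, K/min (hypothetical)
T_alpha = np.array([630.0, 641.0, 653.0, 666.0])   # K, temperature at the same conversion for each rate (hypothetical)

slope, intercept = np.polyfit(1.0 / T_alpha, np.log(betas), 1)
Ea = -slope * R / 1.052                            # Doyle approximation factor used in the FWO method
print(f"Apparent activation energy ~ {Ea / 1000:.1f} kJ/mol")
```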

Keywords: Flammability, microscale combustion calorimetry, thermogravimetric analysis, thermal degradation, kinetic analysis.

8535 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions

Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren

Abstract:

Based on multivariate statistical analysis theory, this paper uses principal component analysis, Mahalanobis distance analysis, and curve fitting to establish a photovoltaic health model for evaluating the health of photovoltaic panels. First, according to weather conditions, the photovoltaic panel variable data are classified into five categories: sunny, cloudy, rainy, foggy, and overcast, and the health of photovoltaic panels under these five types of weather is studied. Second, scatterplots of the relationship between the amount of electricity produced under each kind of weather and the other variables were plotted. It was found that the amount of electricity generated by the photovoltaic panels has a significant nonlinear relationship with time; the fitting method was used to fit the relationship between the amount of electricity generated and time, and a nonlinear equation was obtained. Then, principal component analysis was applied to the independent variables under the five weather conditions. According to the Kaiser-Meyer-Olkin test, three types of weather (overcast, foggy, and sunny) meet the conditions for factor analysis, while cloudy and rainy weather do not. Through principal component analysis, the main components in overcast weather are temperature, AQI, and PM2.5; the main component in foggy weather is temperature; and the main components in sunny weather are temperature, AQI, and PM2.5. Cloudy and rainy weather require analysis of all of their variables, namely temperature, AQI, PM2.5, solar radiation intensity, and time. Finally, taking the variable values in sunny weather as observed values and the main components of cloudy, foggy, overcast, and rainy weather as sample data, the Mahalanobis distances between the observed values and these samples are obtained. A comparative analysis of the degree of deviation of the Mahalanobis distance was carried out to determine the health of the photovoltaic panels under different weather conditions. Ordered from smallest to largest fluctuation of the Mahalanobis distance, the weather conditions were: foggy, cloudy, overcast, and rainy.
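
The two statistical steps named above are sketched below: a principal component analysis of the weather variables and a Mahalanobis distance between an observation and a reference sample. The data are random placeholders, not the photovoltaic dataset from the study.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))    # stand-in columns: temperature, AQI, PM2.5, irradiance, time

pca = PCA(n_components=3).fit(X)
print("explained variance ratios:", pca.explained_variance_ratio_)

def mahalanobis(x, sample):
    """Distance of observation x from the distribution of the reference sample."""
    mu = sample.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

observation = X[0]
print("Mahalanobis distance:", mahalanobis(observation, X[1:]))
```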

Keywords: Fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB.

8534 Analysis of Diverse Clustering Tools in Data Mining

Authors: S. Sarumathi, N. Shanthi, M. Sharmila

Abstract:

Clustering in data mining is an unsupervised learning technique that aggregates data objects into meaningful groups such that intra-cluster similarity is maximized and inter-cluster similarity is minimized. Over the past decades, several clustering tools have emerged with built-in clustering algorithms that make it easier to extract the expected results. Data mining mainly deals with huge databases, which places rigorous computational constraints on cluster analysis. These challenges have paved the way for powerful, scalable data mining clustering software. In this survey, a variety of clustering tools used in data mining are described along with the pros and cons of each.
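
As a generic illustration of the clustering goal stated above (compact, well-separated groups), the sketch below runs k-means on synthetic data; it is not tied to any particular tool surveyed in the paper.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)        # synthetic data with 4 groups
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print("silhouette score (higher = tighter, better-separated clusters):",
      round(silhouette_score(X, labels), 3))
```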

Keywords: Cluster Analysis, Clustering Algorithms, Clustering Techniques, Association, Visualization.

8533 Selection the Optimum Cooling Scheme for Generators based on the Electro-Thermal Analysis

Authors: Diako Azizi, Ahmad Gholami, Vahid Abbasi

Abstract:

Optimal selection of electrical insulation in electrical machinery ensures reliability during operation. From the insulation point of view, the stator is the most important part of an electrical machine. This fact reveals the need to inspect the electrical machine insulation together with the electro-thermal stresses. In the first step of the study, a part of the machine structure that covers the general characteristics of the machine is chosen; then, based on electromagnetic analysis (the finite element method), the machine operation is simulated. In the simulation results, the temperature distribution of the whole structure is presented using electro-thermal analysis. The results of the electro-thermal analysis can be used to design an optimal cooling system. In order to design, review, and compare cooling systems, four winding structures in the stator slots are presented. The structures are compared to each other in terms of electric field distribution, thermal distribution, and remaining insulation life using finite element analysis. Based on these steps, an optimization algorithm is presented for selecting the appropriate structure.

Keywords: Electrical field, field distribution, insulation, winding, finite element method, electro-thermal.

8532 Condition Monitoring for Controlling the Stability of the Rotating Machinery

Authors: A. Chellil, I. Gahlouz, S. Lecheb, A. Nour, S. Chellil, H. Mechakra, H. Kebir

Abstract:

In this paper, the experimental study for the instability of a separator rotor is presented, under dynamic loading response in the harmonic analysis condition. The global measurement and analysis of vibration on the cement separator RC500 is carried, the points of measurement used are radial dots, vertical, horizontal and oblique. The measures of trends and spectral analysis for reconnaissance of the main anomalies, the main defects in the separator and manifestation, the results prove that the defects effect has a negative effect on the stability of the rotor. Experimentally the study of the rotor in transient system allowed to determine the vibratory responses due to the unbalances and various excitations.
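
The sketch below illustrates the kind of spectral analysis mentioned above: an FFT of a vibration signal to locate dominant frequencies (e.g., unbalance at the rotation frequency). The signal is synthetic, not a measurement from the RC500 separator.

```python
import numpy as np

fs = 2000.0                                                # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
rotation_hz = 25.0                                         # hypothetical rotor speed
signal = 1.0 * np.sin(2 * np.pi * rotation_hz * t)         # unbalance component (1x rotation)
signal += 0.3 * np.sin(2 * np.pi * 2 * rotation_hz * t)    # 2x harmonic, as a defect indicator
signal += 0.1 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
print(f"dominant frequency ~ {freqs[spectrum.argmax()]:.1f} Hz")  # expected near the rotation frequency
```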

Keywords: Rotor, experimental, defect, frequency, spectrum.

8531 Water Quality Trading with Equitable Total Maximum Daily Loads

Authors: S. Jamshidi, E. Feizi Ashtiani, M. Ardestani

Abstract:

Waste load allocation (WLA) strategies usually aim to find economic policies for water resource management. Water quality trading (WQT) is an approach that uses a discharge permit market to reduce total environmental protection costs. This primarily requires assigning discharge limits known as total maximum daily loads (TMDLs), which are determined by monitoring organizations with respect to the receiving water quality and remediation capabilities. The purpose of this study is to compare two approaches to TMDL assignment for a WQT policy in the small catchment of the Haraz River in the north of Iran. First, TMDLs are assigned uniformly to all point sources to keep the concentrations of BOD and dissolved oxygen (DO) at the standard level at the checkpoint (terminus point); this was simulated and controlled with the QUAL2Kw software. In the second scenario, TMDLs are assigned using the multi-objective particle swarm optimization (MOPSO) method, in which the environmental violation in the river basin and the total treatment costs are minimized simultaneously. In both scenarios, the equity index and the WLA based on trading discharge permits (TDP) are calculated. The comparative results show that using economically optimized TMDLs (second scenario) yields slightly greater cost savings than the uniform TMDL approach (first scenario): the former costs about 1 M$ annually while the latter costs 1.15 M$. WQT can decrease these annual costs to 0.9 and 1.1 M$, respectively; in other words, these approaches may save about 35% and 45% compared with a command-and-control policy. This means that using multi-objective decision support systems (DSS) may find a more economical WLA, although the gain is not necessarily significant compared with uniform TMDLs. This may be due to the similar impact factors of dischargers in small catchments. Conversely, using uniform TMDLs for WQT brings more equity, so stakeholders are less likely to resent the difference between the TMDL and WQT allocations. In addition, in this case, uniform determination of TMDLs would be much easier to monitor. Consequently, a uniform TMDL for the TDP market is recommended as a sustainable approach, whereas economical TMDLs can be used for larger watersheds.

Keywords: Waste load allocation (WLA), Water quality trading (WQT), Total maximum daily loads (TMDLs), Haraz River, Multi objective particle swarm optimization (MOPSO), Equity.

8530 The Estimation of Human Vital Signs Complexity

Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius

Abstract:

Nonstationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches, which are based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for observing physiological processes. We demonstrate the possibility of using a deep physiological model, based on interpreting changes in the functional states of the human body, combined with an analytical method based on matrix theory for physiological signal analysis, applied to high-risk cardiac patients. It is shown that the evaluation of cardiac signal interactions reveals functional changes peculiar to each individual at the onset of the hemodynamic restoration procedure. Therefore, we suggest that assessment of the alterations in the functional state of the body after patients undergo surgery can be complemented by data obtained from the proposed approach of evaluating the interactions of functional variables.

Keywords: Cardiac diseases, Complex systems theory, ECG analysis, matrix analysis.

8529 Iteration Acceleration for Nonlinear Coupled Parabolic-Hyperbolic System

Authors: Xia Cui, Guang-wei Yuan, Jing-yan Yue

Abstract:

A Picard-Newton iteration method is studied to accelerate the numerical solution of a class of two-dimensional nonlinear coupled parabolic-hyperbolic systems. The Picard-Newton iteration is designed by adding higher-order terms of small quantity to an existing Picard iteration. Discrete functional analysis and inductive hypothesis reasoning techniques are used to overcome difficulties arising from nonlinearity and coupling, and a theoretical analysis is made of the convergence and approximation properties of the iteration scheme. The Picard-Newton iteration has a quadratic convergence rate, and its solution has second-order spatial and first-order temporal approximation to the exact solution of the original problem. Numerical tests verify the results of the theoretical analysis and show that the Picard-Newton iteration is more efficient than the Picard iteration.
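
The scalar toy example below contrasts a plain Picard (fixed-point) iteration with a Newton-type correction for one implicit time step of a nonlinear problem u' = -u^3. It only illustrates the acceleration idea; it is not the paper's coupled two-dimensional scheme.

```python
u_old, dt = 1.0, 0.5
f  = lambda u: -u ** 3        # nonlinear right-hand side
fp = lambda u: -3 * u ** 2    # its derivative

def picard(u):
    return u_old + dt * f(u)                        # fixed-point update

def newton(u):
    g, gp = u - u_old - dt * f(u), 1.0 - dt * fp(u)
    return u - g / gp                               # quadratically convergent update

for name, step in (("Picard", picard), ("Newton", newton)):
    u, errs = 1.0, []
    for _ in range(6):
        u_new = step(u)
        errs.append(abs(u_new - u))
        u = u_new
    print(name, ["%.2e" % e for e in errs])         # the Newton increments shrink much faster
```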

Keywords: Nonlinearity, iterative acceleration, coupled parabolic hyperbolic system, quadratic convergence, numerical analysis.

8528 Modeling a Multinomial Logit Model of Intercity Travel Mode Choice Behavior for All Trips in Libya

Authors: Manssour A. Abdulsalam Bin Miskeen, Ahmed Mohamed Alhodairi, Riza Atiq Abdullah Bin O. K. Rahmat

Abstract:

From the planning point of view, mode choice analysis is essential, given the massive resources involved in transportation systems. Intercity travellers in Libya have distinct features compared with travellers from other countries, including cultural and socioeconomic factors. Consequently, the goal of this study is to characterize intercity travel behavior using disaggregate models for projecting nation-level intercity travel demand in Libya. A multinomial logit model covering all intercity trips has been formulated to examine national-level intercity transportation in Libya. The multinomial logit model was calibrated using a nationwide revealed preference (RP) and stated preference (SP) survey, and was developed for different intercity trip purposes (work, social, and recreational). The model parameters were estimated using the maximum likelihood method. The data needed for model development were obtained from all major intercity corridors in Libya, with a final sample size of 1,300 interviews; about two-thirds of these data were used for model calibration, and the remainder was used for model validation. This study, the first of its kind in Libya, investigates intercity travelers' mode-choice behavior. The intercity travel mode-choice model was successfully calibrated and validated. The outcomes indicate that the overall model is effective and yields high estimation precision. The proposed model is beneficial because it is responsive to many variables and can be employed to determine the impact of changes in numerous characteristics on the demand for various travel modes. Estimates from the model may also be valuable to planners, who can estimate choice probabilities for the various modes and determine the impact of specific policy changes on the demand for intercity travel.
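
The sketch below shows how a multinomial logit model is calibrated by maximum likelihood, as described above, on simulated data. The three modes and two explanatory variables (labelled cost and time) are hypothetical placeholders, not the Libyan RP/SP survey variables.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500), rng.normal(size=500)])  # intercept, cost, time
true_beta = np.array([[0.0, 0.0, 0.0], [0.5, -1.0, 0.3], [-0.2, 0.4, -0.8]])     # mode 0 is the base
util = X @ true_beta.T
prob = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in prob])                                 # simulated mode choices

def neg_log_likelihood(beta_flat):
    beta = np.vstack([np.zeros(3), beta_flat.reshape(2, 3)])   # base mode's coefficients fixed at zero
    v = X @ beta.T
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))    # log of the softmax choice probabilities
    return -logp[np.arange(len(y)), y].sum()

res = minimize(neg_log_likelihood, np.zeros(6), method="BFGS")
print("estimated coefficients (modes 1 and 2 vs. base):\n", res.x.reshape(2, 3).round(2))
```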

Keywords: Multinomial logit model, improved intercity transport, intercity mode-choice behavior, disaggregate analysis.

8527 Analysis of Knowledge Management Trend by Bibliometric Approach

Authors: Hsu-Hao Tsai, Jiann-Min Yang

Abstract:

The analysis concentrates mainly on the productivity trend of the knowledge management literature, using "knowledge management" as the subject in the SSCI database. Its purpose is to summarize trend information for knowledge management researchers, since core knowledge tends to be concentrated in core categories. The results indicate that the productivity of literature on the topic of "knowledge management" is still increasing strongly, and the trend is demonstrated by different categories, including author, country/territory, institution name, document type, language, publication year, and subject area. Focusing on the right categories allows researchers to capture the core research information. This implies that the phenomenon of "success breeds success" is more common in higher-quality publications.

Keywords: Knowledge Management, SSCI, Bibliometric, Lotka's Law

8526 Ranking Alternatives in Multi-Criteria Decision Analysis using Common Weights Based on Ideal and Anti-ideal Frontiers

Authors: Saber Saati Mohtadi, Ali Payan, Azizallah Kord

Abstract:

One of the most important issues in multi-criteria decision analysis (MCDA) is determining the weights of criteria so that all alternatives can be compared based on the collective performance of the criteria. In this paper, one of the popular methods in data envelopment analysis (DEA), known as common weights (CWs), is used to determine the weights in MCDA. Two frontiers, named the ideal and anti-ideal frontiers, are defined instead of ideal and anti-ideal alternatives, based on two newly proposed CWs models; ideal and anti-ideal frontiers are more flexible than ideal and anti-ideal alternatives. From the optimal solutions of these two models, the distances of an alternative from the ideal and anti-ideal frontiers are derived, and a relative distance is then introduced to measure the value of each alternative. The suggested models are linear and remain feasible despite the weight restrictions. An example is presented to explain the method and to compare it with the existing literature.

Keywords: Anti-ideal frontier, Common weights (CWs), Ideal frontier, Multi-criteria decision analysis (MCDA)

8525 1/Sigma Term Weighting Scheme for Sentiment Analysis

Authors: Hanan Alshaher, Jinsheng Xu

Abstract:

Large amounts of data on the web can provide valuable information. For example, product reviews help business owners measure customer satisfaction. Sentiment analysis classifies texts into two polarities: positive and negative. This paper examines movie reviews and tweets using a new term weighting scheme, called one-over-sigma (1/sigma), on benchmark datasets for sentiment classification. The proposed method aims to improve the performance of sentiment classification. The results show that 1/sigma is more accurate than the popular term weighting schemes. In order to verify if the entropy reflects the discriminating power of terms, we report a comparison of entropy values for different term weighting schemes.

Keywords: Sentiment analysis, term weighting scheme, 1/sigma.

8524 Runoff Quality and Pollution Loading from a Residential Catchment in Miri, Sarawak

Authors: Carrie Ho, Choo Bo Quan

Abstract:

Urban non-point source (NPS) pollution from a residential catchment in Miri, Sarawak was investigated for two storm events in 2011. Runoff from the two storm events was sampled and tested for water quality parameters including TSS, BOD5, COD, NH3-N, NO3-N, NO2-N, P, and Pb. Concentrations of the water quality parameters were found to vary significantly between storms, and the pollutants of concern were NO3-N, TSS, COD, and Pb. Results were compared to the Interim National Water Quality Standards for Malaysia (INWQS), and the stormwater runoff from the study can be classified as polluted, exceeding Class III water quality, especially in terms of TSS, COD, and NH3-N, with maximum EMCs of 158, 135, and 2.17 mg/L, respectively.
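
For reference, the event mean concentration (EMC) figures quoted above are flow-weighted averages of the sampled concentrations over a storm hydrograph; the sketch below shows the standard computation with hypothetical flows and concentrations, not the Miri monitoring data.

```python
import numpy as np

# Samples taken at equal time intervals during one storm event (hypothetical values).
flow_lps = np.array([2.0, 8.0, 15.0, 12.0, 7.0, 4.0, 2.0])    # discharge, L/s
tss_mgl  = np.array([60., 180., 150., 110., 80., 55., 40.])   # sampled TSS concentration, mg/L

emc = (tss_mgl * flow_lps).sum() / flow_lps.sum()             # flow-weighted event mean concentration
print(f"EMC (TSS) = {emc:.0f} mg/L")
```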

Keywords: Residential land-use, urban runoff, water quality.

8523 Altered States of Consciousness in Narrative Cinema: Subjective Film Sound

Authors: Mladen Milicevic

Abstract:

This paper addresses subjective film sound as it is represented in narrative cinema. First, "meta-diegetic" sound is briefly explained, followed by a transition to "oneiric" sound. The representation of oneiric sound refers to situations where film characters experience some sort of altered state of consciousness. Looking at an altered state of consciousness in terms of human brain processes points to the cinematic means of expression that "mimic" those processes. Several examples from different films illustrate these points.

Keywords: Oneiric, ASC (altered states of consciousness), film, sound.

8522 Breast Cancer Treatment Evaluation based on Mammographic and Echographic Distance Computing

Authors: M. Caramihai, Irina Severin, H. Balan, A. Blidaru, V. Balanica

Abstract:

Accurate assessment of the primary tumor's response to treatment is important in the management of breast cancer. This paper introduces a new set of treatment evaluation indicators for breast cancer cases based on the computation of three known metrics: the Euclidean, Hamming, and Levenshtein distances. The distance principles are applied to pairs of mammograms and/or echograms recorded before and after treatment, determining a reference point for judging the degree of evolution of the studied carcinoma. The obtained numerical results are very transparent and indicate not only the evolution or involution of the tumor under treatment, but also provide a quantitative measure of the benefit of using the selected method of treatment.
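
For reference, the three metrics named above are sketched below on toy inputs (a numeric feature vector and short coded strings); these are illustrative values, not measurements derived from mammograms or echograms.

```python
import numpy as np

def euclidean(a, b):
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))          # differing positions (equal-length inputs)

def levenshtein(a, b):
    dp = list(range(len(b) + 1))                      # classic dynamic-programming edit distance
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

before, after = [12.1, 8.4, 3.0], [10.2, 7.9, 2.5]    # hypothetical before/after descriptors
print(euclidean(before, after), hamming("10110", "10011"), levenshtein("kitten", "sitting"))
```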

Keywords: Breast cancer, Distance metrics, Cancer treatment evaluation.

8521 Error Analysis of English Inflection among Thai University Students

Authors: Suwaree Yordchim, Toby J. Gibbs

Abstract:

The linguistic competence of Thai university students majoring in Business English was examined in the context of their knowledge of English inflection and various other linguistic elements. Error analysis was applied to the test results. Levels of errors in inflection, tense, and linguistic elements were shown to be significantly high for all noun, verb, and adjective inflections. The findings suggest that students do not gain linguistic competence in their use of English inflection because of interlanguage interference. Implications for curriculum reform and the treatment of errors in the classroom are discussed.

Keywords: Interlanguage, error analysis, inflection, second language acquisition, Thai students.

8520 A Lean Manufacturing Profile of Practices in the Metallurgical Industry: A Methodology for Multivariate Analysis

Authors: Jonathan D. Morales M., Ramón Silva R.

Abstract:

The purpose of this project is to analyze and determine the profile of actual lean manufacturing practices in the Metropolitan Area of Bucaramanga. Through the analysis of qualitative and quantitative variables, it was possible to establish how these manufacturers develop production practices that ensure their competitiveness and productivity in the market. In this study, a random sample of metallurgical and wrought iron companies was taken, after which a quantitative focus and analysis were used to formulate a qualitative methodology for measuring the level of lean manufacturing practices in the industry. A qualitative evaluation was also carried out through a multivariate analysis using the Numerical Taxonomy System (NTSYS) program, which allows Lean Manufacturing profiles to be determined. The results made it possible to observe how the companies in the sector are performing with respect to Lean Manufacturing practices, as well as to identify the level of management these companies exercise with respect to this topic. In addition, it was possible to ascertain that there is no single dominant profile in the sector when it comes to Lean Manufacturing. The companies in the metallurgical and wrought iron industry show low levels of Lean Manufacturing implementation: each carries out diverse actions that are insufficient to consolidate a sectoral strategy for developing a competitive advantage that would tie together a production strategy.

Keywords: Lean manufacturing, metallurgic industry, production line management, productivity.

8519 Aqueous Ranitidine Elimination in Photolytic Processes

Authors: Javier Rivas, Olga Gimeno, Maria Carbajo, Teresa Borralho

Abstract:

The elimination of ranitidine (a pharmaceutical compound) has been carried out in the presence of UV-C radiation. Preliminary experiments showed that the nature of the gas (air or oxygen) bubbled during the photolytic experiments had no influence. From simple photolysis experiments, the quantum yield of this compound was determined using two photolytic approximations: linear source emission in parallel planes and point source emission in spherical planes. The quantum yield obtained was close to 0.05 mol Einstein-1 regardless of the method used. The addition of a free radical promoter (hydrogen peroxide) increases the ranitidine removal rate, while the use of a photocatalyst (TiO2) negatively affects the process.

Keywords: Quantum yield, photolysis, ranitidine, water treatment.

8518 Spatio-Temporal Video Slice Edges Analysis for Shot Transition Detection and Classification

Authors: Aissa Saoudi, Hassane Essafi

Abstract:

In this work, we present a new approach for automatic shot transition detection. Our approach is based on the analysis of Spatio-Temporal Video Slice (STVS) edges extracted from videos. The proposed approach can efficiently detect both abrupt shot transitions (cuts) and gradual ones such as fade-in, fade-out, and dissolve. Compared to other techniques, our method is distinguished by its high precision and speed. This performance is obtained by reducing the shot boundary detection problem to a simple 2D image partitioning problem.

Keywords: Boundary shot detection, Shot transition detection, Video analysis, Video indexing.

8517 Keyword Network Analysis on the Research Trends of Life-Long Education for People with Disabilities in Korea

Authors: Jakyoung Kim, Sungwook Jang

Abstract:

The purpose of this study is to examine research trends in life-long education for people with disabilities using a keyword network analysis. For this purpose, 151 papers were selected from 594 papers retrieved using keywords such as 'people with disabilities' and 'life-long education' in the Korean Education and Research Information Service. The keyword network was constructed by extracting and coding the keywords used in the titles of the selected papers. The frequency of the extracted keywords and their degree and betweenness centrality were analyzed in the keyword network. The results of the keyword network analysis are as follows. First, the main keywords that appeared frequently in research on life-long education for people with disabilities were 'people with disabilities', 'life-long education', 'developmental disabilities', 'current situations', and 'development'; research trends are focused on the current status of life-long education and on program development. Second, the keyword network analysis and visualization showed that keywords with a high frequency of occurrence also generally have high degree centrality and betweenness centrality, and the keyword network diagram confirmed that research trends in life-long education for people with disabilities center on six prominent keywords. Based on these results, it is argued that future life-long education for people with disabilities needs to expand its subjects and support areas, and that research needs to be further expanded into more detailed and specific areas.
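
The sketch below illustrates the analysis described above: keywords that co-occur in a paper title become linked nodes, and degree and betweenness centrality are computed on the resulting network. The keyword lists are illustrative, not the 151 coded Korean papers, and NetworkX is assumed to be available.

```python
import itertools
import networkx as nx

papers = [
    ["people with disabilities", "life-long education", "current situations"],
    ["life-long education", "developmental disabilities", "development"],
    ["people with disabilities", "life-long education", "development"],
]

G = nx.Graph()
for keywords in papers:
    for a, b in itertools.combinations(keywords, 2):            # link every pair of keywords in one title
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)                          # edge weight counts co-occurrences

print("degree centrality:", nx.degree_centrality(G))
print("betweenness centrality:", nx.betweenness_centrality(G))
```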

Keywords: Life-long education, people with disabilities, research trends, keyword network analysis.

8516 Vague Multiple Criteria Decision Making Analysis Method for Fighter Aircraft Selection

Authors: C. Ardil

Abstract:

Fighter aircraft selection is one of the most critical decisions in defense multiple criteria decision-making analysis, as it increases the decisive power of air defense and its superiority in the defense strategy. Vague set theory is an adequate approach for modeling vagueness, uncertainty, and imprecision in decision-making problems. This study integrates vague set theory and the technique for order of preference by similarity to ideal solution (TOPSIS) to support fighter aircraft selection, and the proposed method is applied to the selection of fighter aircraft for the Air Force. In the proposed approach, the ratings of alternatives and the importance weights of criteria for fighter aircraft selection are represented by vague sets. Finally, an illustrative example of fighter aircraft selection is given to demonstrate the applicability and effectiveness of the proposed approach. The fighter aircraft candidates were evaluated under six criteria: costability, payloadability, maneuverability, speedability, stealthility, and survivability. The analysis results show that the best fighter aircraft is the one with the highest closeness coefficient value. The proposed method can also be applied to solve other multiple criteria decision analysis problems.
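
For orientation, the sketch below computes the crisp TOPSIS closeness coefficient underlying the ranking step above; the paper combines TOPSIS with vague sets, which is not reproduced here. The decision matrix, weights, and the assumption that every criterion is a benefit are illustrative only.

```python
import numpy as np

scores  = np.array([[7., 8., 6., 9., 5., 7.],      # aircraft A over the six criteria (hypothetical ratings)
                    [8., 6., 7., 7., 6., 8.],      # aircraft B
                    [6., 9., 8., 6., 7., 6.]])     # aircraft C
weights = np.array([0.2, 0.15, 0.2, 0.15, 0.15, 0.15])   # hypothetical importance weights

norm = scores / np.sqrt((scores ** 2).sum(axis=0))        # vector normalization
v = norm * weights                                        # weighted normalized decision matrix
ideal, anti_ideal = v.max(axis=0), v.min(axis=0)          # all criteria treated as benefits here
d_plus  = np.sqrt(((v - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((v - anti_ideal) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)
print("closeness coefficients:", closeness.round(3))      # highest value = best-ranked aircraft
```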

Keywords: fighter aircraft selection, vague set theory, fuzzy set theory, neutrosophic set theory, multiple criteria decision making analysis, MCDMA, TOPSIS

8515 Transmission Lines Loading Enhancement Using ADPSO Approach

Authors: M. Mahdavi, H. Monsef, A. Bagheri

Abstract:

Discrete particle swarm optimization (DPSO) is a powerful stochastic evolutionary algorithm used to solve large-scale, discrete, and nonlinear optimization problems. However, the standard DPSO algorithm suffers from premature convergence when solving a complex optimization problem such as transmission expansion planning (TEP). To resolve this problem, an advanced discrete particle swarm optimization (ADPSO) is proposed in this paper. The simulation results show that optimizing line loading in transmission expansion planning with ADPSO is better than with DPSO from the precision point of view.

Keywords: ADPSO, TEP problem, Lines loading optimization.

8514 A Phenomic Algorithm for Reconstruction of Gene Networks

Authors: Rio G. L. D'Souza, K. Chandra Sekaran, A. Kandasamy

Abstract:

The goal of gene expression analysis is to understand the processes that underlie the regulatory networks and pathways controlling inter-cellular and intra-cellular activities. Microarray datasets are now extensively used for this purpose, and the scope of such analysis has broadened towards the reconstruction of gene networks and other holistic approaches of Systems Biology. Evolutionary methods are proving successful in such problems, and a number of them have been proposed. However, all these methods are based on processing genotypic information. There is therefore a need to develop evolutionary methods that address phenotypic interactions together with genotypic interactions. We present a novel evolutionary approach, called the Phenomic algorithm, in which the focus is on phenotypic interaction. We use the expression profiles of genes to model the interactions between them at the phenotypic level. We apply this algorithm to the yeast sporulation dataset and show that it can identify gene networks with relative ease.

Keywords: Evolutionary computing, gene expression analysis, gene networks, microarray data analysis, phenomic algorithms.

8513 Effect of Delay on Supply Side on Market Behavior: A System Dynamic Approach

Authors: M. Khoshab, M. J. Sedigh

Abstract:

Dynamic systems, which from a mathematical point of view are those governed by differential equations, are much more difficult to study and to predict than static systems, which are governed by algebraic equations. Economic systems such as markets are among the more complicated dynamic systems. This paper adopts a very simple mathematical model of a market and studies the effect of the supply and demand functions on market behavior when the supply side experiences a lag due to production restrictions.
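
The sketch below simulates the kind of model discussed above: a linear demand curve, a supply response delayed by a production lag, and a price that adjusts to excess demand. All coefficients are illustrative and are not taken from the paper.

```python
import numpy as np

steps, lag = 60, 4                       # simulation length and supply delay, in periods (hypothetical)
a, b = 100.0, 1.0                        # demand:  D(p) = a - b * p
c, d = 10.0,  1.2                        # supply:  S(p) = c + d * p, realized `lag` periods later
k = 0.05                                 # price adjustment speed

price = np.zeros(steps)
price[:lag + 1] = 50.0                   # initial price history
for t in range(lag, steps - 1):
    demand = a - b * price[t]
    supply = c + d * price[t - lag]      # producers respond to the lagged price
    price[t + 1] = price[t] + k * (demand - supply)

# Price approaches the equilibrium (about 40.9 here); larger lags or faster
# adjustment can make the trajectory oscillate or diverge.
print(price[::10].round(2))
```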

Keywords: Dynamic System, Lag on Supply Demand, Market Stability, Supply Demand Model.

8512 Using TRACE and SNAP Codes to Establish the Model of Maanshan PWR for SBO Accident

Authors: B. R. Shen, J. R. Wang, J. H. Yang, S. W. Chen, C. Shih, Y. Chiang, Y. F. Chang, Y. H. Huang

Abstract:

In this research, the TRACE code with the interface code SNAP was used to simulate and analyze a station blackout (SBO) accident in the Maanshan pressurized water reactor (PWR) nuclear power plant (NPP). There are four main steps in this research. First, the SBO accident data of the Maanshan NPP were collected. Second, the TRACE/SNAP model of the Maanshan NPP was established using these data. Third, this TRACE/SNAP model was used to perform the simulation and analysis of the SBO accident. Finally, the simulation and analysis of the SBO with mitigation equipment was performed. The analysis results of TRACE are consistent with the Maanshan NPP data, and according to the TRACE predictions, the mitigation equipment can maintain the safety of Maanshan during the SBO.

Keywords: PWR, TRACE, SBO, Maanshan.
