Search results for: fuzzy logic based analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 46353

44763 Analysis of Barbell Kinematics of Snatch Technique among Women Weightlifters in India

Authors: Manish Kumar Pillai, Madhavi Pathak Pillai, Rajender Lal, Dinesh P. Sharma

Abstract:

India has not yet been able to produce many weightlifters over the years. Karnam Malleshwari is the only woman to have won an Olympic medal for India. When we try to introspect, several reasons emerge; one probable cause could be the lack of biomechanical analysis for technique improvement. The analysis of motion in sports has gained prime importance for technical improvement: it helps an athlete develop a better understanding of his or her own skills and accelerates the technical learning process. Kinematics is concerned with describing and quantifying both the linear and angular positions of bodies and their time derivatives. The technique analysis of barbell movement is very important in weightlifting, but women's weightlifting has a shorter history than men's. Research on women's weightlifting based on video analysis is scarce, and scientific evidence based on kinematic analysis of Indian weightlifters at the national level is limited. Hence, the present investigation aimed to analyze the barbell kinematics of women weightlifters in India. The study was delimited to the medal winners of the 69-kilogram weight category in the All India Inter-University Competition, with ages ranging between 18 and 28 years. The variables selected for the mechanical analysis of barbell kinematics included barbell trajectory, velocity, acceleration, potential energy, kinetic energy, mechanical energy, and average power output. The performance was captured during the competition by two DV PC-60 digital cameras (Panasonic Company, Ltd). The two cameras were placed 6 meters perpendicular to the plane of motion, 130 cm above the ground, to record the frontal and lateral views of the lifters simultaneously. Video recordings were analyzed using Dartfish software, and the barbell kinematics were derived from the information provided by the software. The results of the study clearly indicate that the three lifters differ in the selected kinematic variables with respect to their technique across the five phases of the snatch.
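
The kinematic quantities listed above follow directly from a digitized bar path. The following is a minimal sketch of that post-processing step, not the authors' Dartfish workflow; the frame rate, barbell mass, and vertical trajectory are hypothetical placeholders.

```python
import numpy as np

fps = 50.0                      # assumed camera frame rate (Hz)
mass = 69.0                     # assumed barbell mass (kg)
g = 9.81
t = np.arange(0, 2.0, 1.0 / fps)
y = 0.5 * (1 - np.cos(np.pi * t / 2.0))   # placeholder vertical bar path (m)

v = np.gradient(y, t)           # vertical velocity (m/s)
a = np.gradient(v, t)           # vertical acceleration (m/s^2)
pe = mass * g * y               # potential energy (J)
ke = 0.5 * mass * v ** 2        # kinetic energy (J)
me = pe + ke                    # mechanical energy (J)
power = np.gradient(me, t)      # instantaneous power (W)
print(f"peak velocity {v.max():.2f} m/s, average power {power.mean():.1f} W")
```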

Keywords: dartfish, digital camera, kinematic, snatch, weightlifting

Procedia PDF Downloads 136
44762 Nonlinear Analysis of Reinforced Concrete Arched Structures Considering Soil-Structure Interaction

Authors: Mohamed M. El Gendy, Ibrahim A. El Arabi, Rafeek W. Abdel-Missih, Omar A. Kandil

Abstract:

Nonlinear analysis is one of the most important design and safety tools in structural engineering. Based on the finite-element method, a geometrical and material nonlinear analysis of large-span reinforced concrete arches is carried out considering soil-structure interaction. The concrete section details and reinforcement distribution are taken into account. The behavior of the soil is considered via Winkler's and continuum models. A computer program (NARC II) is specially developed in order to follow the structural behavior of large-span reinforced concrete arches up to failure. The results obtained by the proposed model are compared with the available literature for verification. This work confirmed that the geometrical and material nonlinearities, as well as soil-structure interaction, have a considerable influence on the structural response of reinforced concrete arches.

Keywords: nonlinear analysis, reinforced concrete arched structure, soil-structure interaction, geotechnical engineering

Procedia PDF Downloads 438
44761 Analysis of Cereal Flours by Fluorescence Spectroscopy and PARAFAC

Authors: Lea Lenhardt, Ivana Zeković, Tatjana Dramićanin, Miroslav D. Dramićanin

Abstract:

Rapid and sensitive analytical technologies for food analysis are needed to respond to the growing public interest in food quality and safety. In this context, fluorescence spectroscopy offers several inherent advantages for the characterization of food products: it is highly sensitive, inexpensive, objective, relatively fast, and non-destructive. The objective of this work was to investigate the potential of fluorescence spectroscopy coupled with a multi-way technique for the characterization of cereal flours. Fluorescence landscape, also known as excitation-emission matrix (EEM), spectroscopy utilizes multiple-color illumination, with the full fluorescence spectrum recorded for each excitation wavelength. EEMs were measured on various types of cereal flours (wheat, oat, barley, rye, corn, buckwheat, and rice). The obtained spectra were analyzed using PARAllel FACtor analysis (PARAFAC) in order to decompose the spectra and identify the underlying fluorescent components. The results of the analysis indicated the presence of four fluorophores in cereal flours. It was observed that the relative concentrations of the fluorophores vary between different groups of flours. Based on these findings, we can conclude that the application of PARAFAC analysis to fluorescence data is a good foundation for further qualitative analysis of cereal flours.
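
The decomposition step described above can be illustrated with a short sketch. This is not the authors' code: it assumes a recent tensorly release (where parafac returns a CPTensor) and uses a synthetic data cube in place of the measured flour spectra, with hypothetical grid sizes.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

n_samples, n_excitation, n_emission = 21, 30, 120            # hypothetical grid sizes
eem = np.random.rand(n_samples, n_excitation, n_emission)     # stand-in EEM cube

cp = parafac(tl.tensor(eem), rank=4, n_iter_max=500)          # four fluorophores assumed
sample_scores, excitation_loadings, emission_loadings = cp.factors

# Relative concentration of each fluorophore per flour sample:
rel_conc = sample_scores / sample_scores.sum(axis=1, keepdims=True)
print(rel_conc.shape)   # (21, 4)
```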

Keywords: cereals, flours, fluorescence, PARAFAC

Procedia PDF Downloads 665
44760 Analysis of Thermal Damping in Si Based Torsional Micromirrors

Authors: R. Resmi, M. R. Baiju

Abstract:

The thermal damping of a dynamically vibrating micromirror is an important factor affecting the design of MEMS-based actuator systems. In the development process of new micromirror systems, accurately assessing the extent of energy loss due to thermal damping and predicting the performance of the system are essential. In this paper, the depth of the thermal penetration layer at different eigenfrequencies and the temperature variation distributions surrounding a vibrating micromirror are analyzed. The thermal penetration depth corresponds to the thermal boundary layer in which energy is lost and is therefore a measure of the thermal damping. The energy is mainly dissipated in the thermal boundary layer, and the thickness of this layer is an important parameter. Detailed thermoacoustics is used to model the air domain surrounding the micromirror. The thickness of the boundary layer, the temperature variations, and the thermal power dissipation are analyzed for a Si-based torsional-mode micromirror. It is found that the thermal penetration depth decreases with eigenfrequency, and hence operating the micromirror at higher frequencies is essential for reducing thermal damping. The temperature variations and thermal power dissipation at different eigenfrequencies are also analyzed. Both frequency-response and eigenfrequency analyses are done using COMSOL Multiphysics software.
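
The frequency dependence noted above can be made concrete with the classical thermal penetration depth, delta = sqrt(2*k / (rho*cp*omega)). This is an illustrative sketch only (the paper itself uses COMSOL thermoacoustics); the air properties and eigenfrequencies are assumed values.

```python
import numpy as np

k, rho, cp = 0.026, 1.2, 1005.0          # air: conductivity, density, heat capacity
for f in (10e3, 50e3, 100e3, 500e3):     # hypothetical eigenfrequencies (Hz)
    omega = 2 * np.pi * f
    delta = np.sqrt(2 * k / (rho * cp * omega))
    print(f"{f/1e3:6.0f} kHz -> thermal penetration depth {delta*1e6:5.2f} um")
```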

Keywords: eigenfrequency analysis, micromirrors, thermal damping, thermoacoustic interactions

Procedia PDF Downloads 366
44759 Investigating the Effectiveness of Multilingual NLP Models for Sentiment Analysis

Authors: Othmane Touri, Sanaa El Filali, El Habib Benlahmar

Abstract:

Natural Language Processing (NLP) has gained significant attention lately and has proved its ability to analyze and extract insights from unstructured text data in various languages. One of the most popular NLP applications is sentiment analysis, which aims to identify the sentiment expressed in a piece of text, such as positive, negative, or neutral, in multiple languages. While several multilingual NLP models are available for sentiment analysis, their effectiveness in different contexts and applications needs to be investigated. In this study, we investigate the effectiveness of different multilingual NLP models for sentiment analysis on a dataset of online product reviews in multiple languages. The performance of several NLP models, including Google Cloud Natural Language API, Microsoft Azure Cognitive Services, Amazon Comprehend, Stanford CoreNLP, spaCy, and Hugging Face Transformers, is compared. The models are evaluated on several metrics, including accuracy, precision, recall, and F1 score, and their performance is compared across different categories of product reviews. The dataset was preprocessed by cleaning and tokenizing the text data in multiple languages. Each model was then trained and tested using a cross-validation approach, in which the dataset is randomly divided into training and testing sets and the process is repeated multiple times. A grid search was applied to optimize the hyperparameters of each model and to select the best-performing model for each category of product reviews and each language. The findings of this study provide insights into the effectiveness of different multilingual NLP models for multilingual sentiment analysis and their suitability for different languages and applications. The strengths and limitations of each model are identified, and recommendations are provided for selecting the most performant model based on the specific requirements of a project. This study contributes to the advancement of research methods in multilingual NLP and provides a practical guide for researchers and practitioners in the field.
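
A minimal sketch of the evaluation protocol (cross-validation plus grid search) is shown below. The commercial APIs compared in the paper are not reproduced; a TF-IDF plus logistic-regression baseline stands in, and the multilingual reviews and labels are placeholders.

```python
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

reviews = ["great phone", "producto terrible", "bof, sans plus", "sehr gut"] * 25
labels = [1, 0, 0, 1] * 25     # hypothetical positive/negative tags

pipe = Pipeline([("tfidf", TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))),
                 ("clf", LogisticRegression(max_iter=1000))])
grid = GridSearchCV(pipe, {"clf__C": [0.1, 1.0, 10.0]}, scoring="f1", cv=5)
grid.fit(reviews, labels)
print(grid.best_params_,
      cross_val_score(grid.best_estimator_, reviews, labels, cv=5, scoring="f1").mean())
```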

Keywords: NLP, multilingual, sentiment analysis, texts

Procedia PDF Downloads 104
44758 Entropy Analysis of a Thermo-Acoustic Stack

Authors: Ahmadali Shirazytabar, Hamidreza Namazi

Abstract:

The inherent irreversibility of thermo-acoustic devices, primarily in the stack region, causes the poor efficiency of thermo-acoustic engines, which is the major weakness of these devices. In view of the above, this study examines entropy generation in the stack of a thermo-acoustic system. For this purpose, two parallel plates representative of the stack are considered. A general equation for entropy generation is derived based on the Second Law of thermodynamics. Assumptions such as Rott's linear thermo-acoustic approximation and boundary-layer-type flow are made to simplify the governing continuity, momentum, and energy equations and to obtain analytical solutions for velocity and temperature. The entropy generation equation is also simplified based on the same assumptions and then converted to dimensionless form using a characteristic entropy generation. A time-averaged entropy generation rate and then a global entropy generation rate are calculated and represented graphically to inspect the effect of different parameters on entropy generation.
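
As a worked reference for the starting point of such an analysis, the standard local entropy generation rate for a Newtonian fluid (heat conduction plus viscous dissipation) and its time-and-volume average can be written as below. This is the common textbook form, not necessarily the exact expression derived in the paper; the notation (T_0 mean temperature, Phi viscous dissipation function, tau acoustic period) is assumed.

```latex
\dot{S}'''_{\mathrm{gen}} = \frac{k}{T_{0}^{2}}\left(\nabla T\right)^{2}
                          + \frac{\mu}{T_{0}}\,\Phi ,
\qquad
\langle \dot{S}_{\mathrm{gen}} \rangle
  = \frac{1}{\tau}\int_{0}^{\tau}\!\!\int_{V} \dot{S}'''_{\mathrm{gen}}\,\mathrm{d}V\,\mathrm{d}t .
```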

Keywords: thermo-acoustics, entropy, second law of thermodynamics, Rott’s linear thermo-acoustic approximation

Procedia PDF Downloads 403
44757 Current-Based Multiple Faults Detection in Electrical Motors

Authors: Moftah BinHasan

Abstract:

Induction motors (IM) are vital components in industrial processes whose failure may lead to an unexpected interruption at the industrial plant, with heavy consequences in costs, product quality, and safety. Among the different detection approaches proposed in the literature, the one based on stator current monitoring, termed Motor Current Signature Analysis (MCSA), is the most preferred. MCSA is advantageous due to its non-invasive nature. The popularity of MCSA comes from the fact that the current contains motor harmonics around the supply frequency whose properties reflect different healthy and faulty conditions. One of the techniques applied to the machine line current is spectrum analysis. Besides discussing the fundamentals of MCSA and its applications in the condition monitoring arena, this paper presents a summary of the most frequent faults and their corresponding signatures in the stator current spectrum of an induction motor. In addition, this article presents different case studies of induction motor fault diagnosis. These faults were seeded in the machine, which was run for more than an hour for each test before the results were recorded for the faulty conditions. These results are then compared with those for the healthy cases that were recorded earlier.
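
A minimal MCSA sketch is given below, not the paper's test rig: an FFT of one stator-current phase and inspection of the region around the supply frequency, where sidebands at (1 +/- 2s)*f are commonly associated with broken rotor bars. The supply frequency, slip, and synthetic current signal are assumptions.

```python
import numpy as np

fs, f_supply, slip = 10_000, 50.0, 0.03
t = np.arange(0, 10, 1 / fs)
i_a = np.sin(2 * np.pi * f_supply * t) \
    + 0.02 * np.sin(2 * np.pi * f_supply * (1 - 2 * slip) * t) \
    + 0.02 * np.sin(2 * np.pi * f_supply * (1 + 2 * slip) * t) \
    + 0.01 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(i_a * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 40) & (freqs < 60)
for f_k, a_k in zip(freqs[band], spectrum[band]):
    if a_k > 0.01 * spectrum.max():          # crude peak picking around the supply frequency
        print(f"component near {f_k:6.2f} Hz")
```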

Keywords: induction motor, condition monitoring, fault diagnosis, MCSA, rotor, stator, bearing, eccentricity

Procedia PDF Downloads 459
44756 Fractal Analysis of Polyacrylamide-Graphene Oxide Composite Gels

Authors: Gülşen Akın Evingür, Önder Pekcan

Abstract:

Fractal analysis is a bridge between the microstructure and the macroscopic properties of gels. A fractal structure is usually invoked to describe the complexity of crosslinked molecules, and the complexity of gel systems is described by the fractal dimension (Df). In this study, polyacrylamide-graphene oxide (GO) composite gels were prepared by free radical crosslinking copolymerization. The composite gels were analyzed at various GO contents during gelation using the fluorescence technique, and the analysis was applied to estimate the Df values of the composite gels. The fractal dimensions of the polymer composite gels were estimated from the power-law exponent values using scaling models. In addition, we aimed to present the geometrical distribution of GO during gelation. We observed that, as gelation proceeded, the GO plates first organized themselves into a 3D percolation cluster with Df = 2.52, then into diffusion-limited clusters with Df = 1.4, and finally lined up into a Von Koch curve with random intervals with Df = 1.14. Our goal is to interpret the low conductivity and/or broad forbidden gap of GO-doped PAAm gels through the distribution of GO in the final form of the produced gel.
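
The generic step behind relating a measured exponent to Df is a power-law fit on log-log axes. The sketch below illustrates only that step with synthetic data, not the fluorescence measurements or the specific scaling model of the paper.

```python
import numpy as np

x = np.logspace(0, 3, 40)                  # e.g. reaction time or cluster size (placeholder)
true_exponent = -0.75
y = 3.0 * x ** true_exponent * (1 + 0.05 * np.random.randn(x.size))

slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
print(f"fitted power-law exponent {slope:.3f}")   # the value a scaling model maps onto Df
```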

Keywords: composite gels, fluorescence, fractal, scaling

Procedia PDF Downloads 307
44755 A Comparative Analysis of E-Government Quality Models

Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri

Abstract:

Many quality models have been used to measure the quality of e-government portals. However, the absence of an international consensus on e-government portal quality models results in many differences in terms of quality attributes and measures. The aim of this paper is to compare and analyze the existing e-government quality models proposed in the literature (those that are based on ISO standards and those that are not) in order to propose guidelines for building a good and useful e-government portal quality model. Our findings show that there is no e-government portal quality model based on the new international standard ISO 25010. Besides that, the existing quality models are not based on a best-practice model that would allow agencies to both measure e-government portal quality and identify missing best practices for those portals.

Keywords: e-government, portal, best practices, quality model, ISO, standard, ISO 25010, ISO 9126

Procedia PDF Downloads 560
44754 A Quality Index Optimization Method for Non-Invasive Fetal ECG Extraction

Authors: Lucia Billeci, Gennaro Tartarisco, Maurizio Varanini

Abstract:

Fetal cardiac monitoring by fetal electrocardiogram (fECG) can provide significant clinical information about the health condition of the fetus. Despite this potential, the use of fECG in clinical practice has so far been quite limited due to the difficulties in measuring it. The recovery of fECG from signals acquired non-invasively using electrodes placed on the maternal abdomen is a challenging task, because abdominal signals are a mixture of several components and the fetal one is very weak. This paper presents an approach for fECG extraction from abdominal maternal recordings that exploits the pseudo-periodicity of the fetal ECG. It consists of devising a quality index (fQI) for the fECG and finding the linear combinations of preprocessed abdominal signals that maximize this fQI (quality index optimization, QIO). It aims at improving the performance of the most commonly adopted methods for fECG extraction, usually based on estimating and canceling the maternal ECG (mECG). The procedure for fECG extraction and fetal QRS (fQRS) detection is completely unsupervised and based on the following steps: signal pre-processing; mECG extraction and maternal QRS detection; mECG component approximation and canceling by weighted principal component analysis; fECG extraction by fQI maximization and fetal QRS detection. The proposed method was compared with our previously developed procedure, which obtained the highest score at the PhysioNet/Computing in Cardiology Challenge 2013. That procedure was based on removing the mECG, estimated by principal component analysis (PCA), from the abdominal signals and applying independent component analysis (ICA) to the residual signals. Both methods were developed and tuned using 69 one-minute-long abdominal measurements with fetal QRS annotations from dataset A of the PhysioNet/Computing in Cardiology Challenge 2013. The QIO-based and the ICA-based methods were compared on two databases of abdominal maternal ECG available on the PhysioNet site. The first is the Abdominal and Direct Fetal Electrocardiogram Database (ADdb), which contains fetal QRS annotations and thus allows a quantitative performance comparison; the second is the Non-Invasive Fetal Electrocardiogram Database (NIdb), which does not contain fetal QRS annotations, so that the comparison between the two methods can only be qualitative. On the annotated database ADdb, the QIO method provided the performance indexes Sens = 0.9988, PPA = 0.9991, F1 = 0.9989, overcoming the ICA-based one, which provided Sens = 0.9966, PPA = 0.9972, F1 = 0.9969. The comparison on NIdb was performed by defining an index of quality for the fetal RR series; this index was higher for the QIO-based method than for the ICA-based one in 35 out of 55 records of the NIdb. The QIO-based method gave very high performance with both databases. The results of this study foresee the application of the algorithm in a fully unsupervised way in wearable devices for self-monitoring of fetal health.
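
The comparison method (mECG subspace removal by PCA followed by ICA on the residuals) can be sketched schematically as below. The channel data and dimensions are synthetic placeholders, not the PhysioNet Challenge 2013 recordings, and the fQI-based QIO method itself is not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

fs, n_channels = 1000, 4
t = np.arange(0, 60, 1 / fs)
abdominal = np.random.randn(t.size, n_channels)        # stand-in abdominal leads

# 1) approximate the maternal ECG subspace and remove it
pca = PCA(n_components=2)
maternal = pca.inverse_transform(pca.fit_transform(abdominal))
residual = abdominal - maternal

# 2) unmix the residuals; one independent component should carry the fetal ECG
ica = FastICA(n_components=2, max_iter=500, random_state=0)
sources = ica.fit_transform(residual)
print(sources.shape)    # (samples, candidate fECG components)
```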

Keywords: fetal electrocardiography, fetal QRS detection, independent component analysis (ICA), optimization, wearable

Procedia PDF Downloads 280
44753 Research on Road Openness in the Old Urban Residential District Based on Space Syntax: A Case Study on Kunming within the First Loop Road

Authors: Haoyang Liang, Dandong Ge

Abstract:

With the rapid development of Chinese cities, traffic congestion has become more and more serious. At the same time, there are many closed old residential areas in Chinese cities, which seriously affect the connectivity of urban roads and reduce the density of urban road networks. After reopening these restricted old residential areas, the internal roads of the original residential areas are transformed into urban roads, which greatly helps to alleviate traffic congestion. This paper uses space syntax theory to analyze the urban road network and compares roads by their integration and connectivity values to evaluate whether opening the roads in residential areas can improve urban traffic. Based on the road network within the first loop road in Kunming, a Space Syntax evaluation model is established for the status analysis, and a comparative analysis is used to assess the change in the model before and after opening the roads of the old urban residential districts within the first loop road. The areas that show a significant difference are then selected for small-scale model analysis. Based on the analysis results and the traffic situation, an evaluation of road openness in old urban residential districts is proposed to improve them.
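
A minimal sketch of the graph measures behind such a before/after comparison is shown below. It is not the space syntax software used in the paper: connectivity is taken as node degree, integration is approximated simply as the reciprocal of mean depth (real space syntax integration uses RRA normalization), and the toy axial graph and the "opened" edge are hypothetical.

```python
import networkx as nx

axial = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("B", "E")])

def mean_depth(g, node):
    depths = nx.single_source_shortest_path_length(g, node)
    return sum(depths.values()) / (len(g) - 1)

before = {n: (axial.degree(n), 1.0 / mean_depth(axial, n)) for n in axial}

axial.add_edge("C", "E")        # a road inside the estate opened to the city grid
after = {n: (axial.degree(n), 1.0 / mean_depth(axial, n)) for n in axial}
for n in axial:
    print(n, "connectivity/integration before", before[n], "after", after[n])
```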

Keywords: Space Syntax, Kunming, urban renovation, traffic jam

Procedia PDF Downloads 162
44752 State, Public Policies, and Rights: Public Expenditure and Social and Welfare Policies in America, as Opposed to Argentina

Authors: Mauro Cristeche

Abstract:

This paper approaches the intervention of the American State in the social arena and the modeling of the rights system from the Argentinian experience, by observing the characteristics of its federal budgetary system, the evolution of social public spending and welfare programs in recent years, labor and poverty statistics, and the changes in the labor market structure. The analysis seeks to combine different methodologies and sources: in-depth interviews with specialists, analysis of theoretical and mass-media material, and statistical sources. Among the results, it could be mentioned that the tendency to state interventionism (what has been called 'nationalization of social life') is quite evident in the United States and manifests itself in multiple forms. The bibliography consulted and the experts interviewed pointed to this historical increase in the state presence (beyond short-term setbacks) in terms of public spending, fiscal pressure, public employment, protective and control mechanisms, the extension of welfare policies to the poor sectors, etc. In fact, despite the significant differences between both countries, the United States and Argentina have common patterns of behavior in terms of the aforementioned phenomena. On the other hand, dissimilarities are also important. Some of them are determined by each country's own political history. The influence of political parties on the economic model seems more decisive in the United States than in Argentina, where the tendency to state interventionism is more stable. The centrality of health spending is evident in America, while in Argentina that discussion is more concentrated in the social security system and public education. The biggest problem of the labor market in the United States is disqualification as a consequence of technological development, while in Argentina it is a result of the labor market's weakness. Another big difference is the huge American public spending on defense. The more federal character of the American State, as opposed to the centralized Argentine state, is also a factor of differential analysis. American public employment (around 10%) is considerably lower than the Argentinian (around 18%). The social statistics show differences, but inequality and poverty have been growing as a trend in the last decades in both countries. According to public figures, poverty represents 14% in the United States and 33% in Argentina. American public spending is important (welfare spending and total public spending represent around 12% and 34% of GDP, respectively), but a bit lower than the Latin American or European averages. In both cases, the tendency toward underemployment and unemployment through disqualification has not assumed serious proportions. Probably one of the most important aspects of the analysis is that private initiative and public intervention are much more intertwined in the United States, which makes state intervention more 'fuzzy', while in Argentina the difference is clearer. Finally, the power of capital accumulation in the United States, and more specifically of its industrial and services sectors, which continue to be the engine of the economy, marks a great difference with Argentina, which relies on its agro-industrial strength and its public sector.

Keywords: state intervention, welfare policies, labor market, system of rights, United States of America

Procedia PDF Downloads 131
44751 Impact on the Results of Sub-Group Analysis on Performance of Recommender Systems

Authors: Ho Yeon Park, Kyoung-Jae Kim

Abstract:

The purpose of this study is to investigate whether friendship in social media can be an important factor in recommender systems, through social scientific analysis of friendship in popular social media such as Facebook and Twitter. For this purpose, this study analyzes data on friendship in real social media using component analysis and clique analysis, two sub-group analyses used in social network analysis. We propose an algorithm to reflect the results of the sub-group analysis in the recommender system. The key to this algorithm is that recommendations coming from users within a friendship are more likely to be reflected in the recommendations made to a user. As a result of this study, the outcomes of various sub-group analyses were derived, and it was confirmed that these results differ from the results of the existing recommender system. Therefore, the results of the sub-group analysis are considered to affect the recommendation performance of the system. Future research will attempt to generalize the results through further analysis of various social data.
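
A minimal sketch of the sub-group step is given below: connected components and maximal cliques on a toy friendship graph, with a naive per-user boost for items liked inside the user's cliques. The graph, the ratings, and the weighting scheme are illustrative assumptions, not the paper's algorithm.

```python
import networkx as nx

friends = nx.Graph([("ana", "bo"), ("bo", "cy"), ("ana", "cy"), ("dee", "ed")])
likes = {"ana": {"film1"}, "bo": {"film1", "film2"}, "cy": {"film3"},
         "dee": {"film2"}, "ed": set()}

print(list(nx.connected_components(friends)))   # component analysis
print(list(nx.find_cliques(friends)))           # clique analysis

def clique_boosted_scores(user):
    cliques = [c for c in nx.find_cliques(friends) if user in c]
    peers = {p for c in cliques for p in c if p != user}
    scores = {}
    for peer in peers:
        for item in likes[peer]:
            scores[item] = scores.get(item, 0) + 1   # clique friends weigh more
    return scores

print(clique_boosted_scores("ana"))
```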

Keywords: sub-group analysis, social media, social network analysis, recommender systems

Procedia PDF Downloads 364
44750 A Non-Parametric Based Mapping Algorithm for Use in Audio Fingerprinting

Authors: Analise Borg, Paul Micallef

Abstract:

Over the past few years, online multimedia collections have grown at a fast pace. Several companies have shown interest in studying ways to organize this amount of audio information without the need for human intervention to generate metadata. Many applications have emerged on the market which are capable of identifying a piece of music in a short time, but different audio effects and degradations make it much harder to identify the unknown piece. In this paper, an audio fingerprinting system which makes use of a non-parametric algorithm is presented. Parametric analysis is also performed using Gaussian Mixture Models (GMMs). The feature extraction methods employed are the Mel Spectrum Coefficients and the MPEG-7 basic descriptors. Bin numbers replaced the extracted feature coefficients during the non-parametric modelling. The results show that non-parametric analysis offers results comparable to those reported in the literature.
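
The parametric branch (spectral features modelled by a GMM) can be illustrated with the short sketch below. It assumes librosa and scikit-learn are available, uses a synthetic tone in place of real audio, and omits the MPEG-7 descriptors and the non-parametric bin mapping that are central to the paper.

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

sr = 22050
y = 0.5 * np.sin(2 * np.pi * 440 * np.arange(0, 3, 1 / sr))   # stand-in audio clip

mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T          # (frames, coefficients)
gmm = GaussianMixture(n_components=8, covariance_type="diag").fit(mfcc)
print("average log-likelihood of the clip under its own model:", gmm.score(mfcc))
```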

Keywords: audio fingerprinting, mapping algorithm, Gaussian Mixture Models, MFCC, MPEG-7

Procedia PDF Downloads 421
44749 Research of the Three-Dimensional Visualization Geological Modeling of Mine Based on Surpac

Authors: Honggang Qu, Yong Xu, Rongmei Liu, Zhenji Gao, Bin Wang

Abstract:

Today's mining industry is advancing gradually toward digitalization and visualization. Three-dimensional visual geological modeling of a mine is the digital characterization of mineral deposits and one of the key technologies of digital mining. Three-dimensional geological modeling is a technology that combines geological spatial information management, geological interpretation, geological spatial analysis and prediction, geostatistical analysis, entity content analysis, and graphic visualization in a three-dimensional computer environment, and it is used in geological analysis. In this paper, a three-dimensional geological model of an iron mine is constructed using Surpac; the weighting difference between the distance power inverse ratio method and ordinary kriging is studied, and the ore body volume and reserves are simulated and calculated using these two methods. Compared with the actual mine reserves, the results are relatively accurate, so the model provides a scientific basis for mine resource assessment, reserve calculation, mining design, and so on.
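
A small sketch of an inverse-distance-weighting estimator (the usual reading of the "distance power inverse ratio" method) is given below; the ordinary-kriging counterpart and the Surpac block model are not reproduced, and the sample coordinates and grades are hypothetical.

```python
import numpy as np

samples = np.array([[0, 0, 10], [5, 0, 20], [0, 5, 40], [5, 5, 30]], float)  # x, y, grade

def idw(point, samples, power=2.0):
    d = np.linalg.norm(samples[:, :2] - point, axis=1)
    if np.any(d == 0):
        return samples[d == 0, 2][0]        # exactly on a sample: return its grade
    w = 1.0 / d ** power
    return float(np.sum(w * samples[:, 2]) / np.sum(w))

print(idw(np.array([2.5, 2.5]), samples))   # estimated grade at an unsampled block centre
```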

Keywords: three-dimensional geological modeling, geological database, geostatistics, block model

Procedia PDF Downloads 78
44748 Extending the AOP Joinpoint Model for Memory and Type Safety

Authors: Amjad Nusayr

Abstract:

Software security is a general term applied to any type of software architecture or model in which security aspects are incorporated into the architecture. These aspects are not part of the main logic of the underlying program. Software security can be achieved using a combination of approaches, including but not limited to secure software designs, third-party component validation, and secure coding practices. Memory safety is one aspect of software security in which we ensure that any object in memory is accessed through a valid pointer or reference with a valid type. Aspect-Oriented Programming (AOP) is a paradigm concerned with capturing cross-cutting concerns in code development. AOP is generally used for common cross-cutting concerns like logging and DB transaction management. In this paper, we introduce the concepts that enable AOP to be used for the purpose of memory and type safety. We also present ideas for extending AOP in software security practices.
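
A conceptual sketch of the idea is shown below: a Python decorator standing in for AOP "around" advice at a method-call joinpoint, weaving in the null-reference and type checks discussed above. It is not AspectJ and not the extended joinpoint model proposed in the paper; the function names and checks are illustrative.

```python
import functools

def safe_call(expected_types):
    """Aspect-style advice: validate references and types before the core logic runs."""
    def aspect(func):
        @functools.wraps(func)
        def advice(*args, **kwargs):
            for value, expected in zip(args, expected_types):
                if value is None:
                    raise ValueError(f"{func.__name__}: null reference argument")
                if not isinstance(value, expected):
                    raise TypeError(f"{func.__name__}: expected {expected.__name__}")
            return func(*args, **kwargs)
        return advice
    return aspect

@safe_call((bytes, int))
def write_block(buffer, offset):
    return buffer[offset:]        # core logic stays free of safety checks

print(write_block(b"payload", 2))
```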

Keywords: aspect oriented programming, programming languages, software security, memory and type safety

Procedia PDF Downloads 127
44747 Elasto-Plastic Analysis of Structures Using Adaptive Gaussian Springs Based Applied Element Method

Authors: Mai Abdul Latif, Yuntian Feng

Abstract:

The Applied Element Method (AEM) is a method that was developed to aid in the analysis of the collapse of structures. Currently available methods cannot deal with structural collapse accurately; however, AEM can simulate the behavior of a structure from an initial state of no loading until collapse. The elements in AEM are connected by sets of normal and shear springs along the edges of the elements, which represent the stresses and strains of the element in that region. The elements are rigid, and the material properties are introduced through the spring stiffness. Nonlinear dynamic analysis of progressive collapse has been widely modelled using the finite element method; however, difficulties were found in the presence of excessively deformed elements with cracking or crushing, as well as a high computational cost and difficulties in choosing appropriate material models. In this work, the Applied Element Method is developed and coded to significantly improve its accuracy and reduce its computational cost. The scheme works for both linear elastic and nonlinear cases, including elasto-plastic materials. This paper focuses on elastic and elasto-plastic material behaviour, where the number of springs required for an accurate analysis is tested. A steel cantilever beam is used as the structural element for the analysis. The first modification of the method is based on Gaussian quadrature to distribute the springs. Usually, the springs are equally distributed along the face of the element, but it was found that with Gaussian springs only 2 springs were required for perfectly elastic cases, whereas with equally spaced springs at least 5 were required. The method runs on a Newton-Raphson iteration scheme, and quadratic convergence was obtained. The second modification is based on adapting the number of springs required depending on the elasticity of the material. After the first Newton-Raphson iteration, the von Mises stress condition is used to calculate the stresses in the springs, and the springs are classified as elastic or plastic. Transition springs, located exactly between the elastic and plastic regions, are then interpolated between regions to strictly identify the elastic and plastic regions in the cross-section. Since a rectangular cross-section was analyzed, there were two plastic regions (top and bottom) and one elastic region (middle). The results of the present study show that elasto-plastic cases require only 2 springs for the elastic region and 2 springs for each plastic region. This improves the computational cost, reducing the minimum number of springs in elasto-plastic cases to only 6. All the work is done in MATLAB, and the results are compared with finite element models of structural elements in ANSYS.
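
A small sketch of the first modification only is shown below: placing connection springs at Gauss-Legendre points along an element edge instead of spacing them equally. The edge length and spring counts are illustrative, and the full AEM assembly and Newton-Raphson solver are not reproduced.

```python
import numpy as np

def gaussian_spring_positions(edge_length, n_springs):
    xi, weights = np.polynomial.legendre.leggauss(n_springs)   # points/weights on [-1, 1]
    positions = 0.5 * edge_length * (xi + 1.0)                 # map to [0, L]
    tributary = 0.5 * edge_length * weights                     # edge length carried by each spring
    return positions, tributary

pos, trib = gaussian_spring_positions(edge_length=0.2, n_springs=2)   # 2 springs: elastic case
print(pos, trib)
```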

Keywords: applied element method, elasto-plastic, Gaussian springs, nonlinear

Procedia PDF Downloads 225
44746 Geo Spatial Database for Railway Assets Management

Authors: Muhammad Umar

Abstract:

Safety and asset management are considered the backbone of every department. GIS has become very important for railways to manage assets and security through digital maps and web-based GIS maps, and it provides a complete framework for the management of assets across the organization. Pakistan Railways is the most common and safest mode of traveling in Pakistan. There is an ever-increasing demand to transport a huge amount of information generated from various sources, and this information must be accurate. This creates problems for passengers and the administration and causes financial and time losses. GIS solves this problem through digital maps and databases, providing real-time spatial and statistical analysis that helps users communicate and exchange information in a sophisticated way. A GIS-based web system allows different end users to make queries simultaneously, as per their requirements. This GIS system provides the organization with a complete monitoring, safety, and decision system for tracks, stations, and junctions, which can further be used for the analysis of different areas, i.e., analysis of tracks, junctions, and stations in case of reconstruction, rescue for rail accidents, and natural disasters. This research work helps reduce financial losses and human mistakes and provides a complete security and management system for assets.

Keywords: Geographical Information System (GIS) for assets management, geo spatial database, railway assets management, Pakistan

Procedia PDF Downloads 491
44745 Non-Linear Static Analysis of Screwed Moment Connections in Cold-Formed Steel Frames

Authors: Jikhil Joseph, Satish Kumar S R.

Abstract:

Cold-formed steel frames are preferable for framed constructions due to their low seismic weight, which results in low seismic forces; on the other hand, significant lateral deflections are expected under seismic/wind loading. The factors affecting the lateral stiffness of steel frames are the stiffnesses of the connections, beams, and columns; increasing the stiffness of the beams and columns and making the connections rigid will therefore enhance the lateral stiffness. The present study focuses on structural elements made of rectangular hollow sections and fastened with screwed in-plane moment connections for building frames. The self-drilling screws can be easily drilled on either side of the connection area with the help of gusset plates. The strength of screwed connections can be made 1.2 times that of the connecting elements. However, achieving high stiffness in connections is also a challenging job. Hence, in addition to the beam and column stiffnesses, the connection stiffness is also a governing parameter for the lateral deflections of the frames. Non-linear static analysis in SAP2000 is planned to study the seismic behavior of the steel frames. The SAP model will consist of a nonlinear spring model for the connection to account for semi-rigidity, and nonlinear hinges will be assigned to the beam and column sections according to FEMA 273 guidelines. Reliable spring and hinge parameters will be assigned based on an experimental and analytical database. The non-linear static analysis is mainly focused on the identification of the various hinge formations and the estimation of lateral deflection, and these will serve as inputs for direct displacement-based seismic design. The outputs of this study are modelling techniques and suitable design guidelines for the performance-based seismic design of cold-formed steel frames.

Keywords: buckling, cold formed steel, nonlinear static analysis, screwed connections

Procedia PDF Downloads 178
44744 Multiscale Edge Detection Based on Nonsubsampled Contourlet Transform

Authors: Enqing Chen, Jianbo Wang

Abstract:

It is well known that the wavelet transform provides a very effective framework for multiscale edge analysis. However, wavelets are not very effective in representing images containing distributed discontinuities such as edges. In this paper, we propose a novel multiscale edge detection method in the nonsubsampled contourlet transform (NSCT) domain, which exploits the dominant multiscale, multidirectional edge representation and the excellent edge localization of the NSCT. Experiments on real images demonstrate that the proposed method outperforms edge detection methods based on the Canny operator, wavelets, and contourlets. Additionally, the proposed method also works well for noisy images.

Keywords: edge detection, NSCT, shift invariant, modulus maxima

Procedia PDF Downloads 488
44743 Application of Artificial Neural Network in Assessing Fill Slope Stability

Authors: An-Jui. Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung

Abstract:

This paper details the utilization of artificial intelligence (AI) in the field of slope stability, whereby quick and convenient solutions can be obtained using the developed tool. The AI tool used in this study is the artificial neural network (ANN), while the slope stability analysis methods are the finite element limit analysis methods. The developed tool allows for the prompt prediction of the safety factors of fill slopes and their corresponding probability of failure (depending on the degree of variation of the soil parameters), which can give the practicing engineer a reasonable basis for decision-making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to the conventional methods of modeling, which at times may be tedious and repetitive during the preliminary design stage, where the focus is more on cost-saving options rather than detailed design. Therefore, similar ANN-based tools can be further developed to assist engineers in this aspect.
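
A minimal sketch of the Extreme Learning Machine named above is given below (a random hidden layer plus output weights obtained in one least-squares solve); the slope features and safety-factor targets are random placeholders, not the paper's training data.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 4))                      # e.g. slope angle, height, c, phi (scaled)
y = 0.8 + 1.5 * X[:, 2] - 0.6 * X[:, 0]       # stand-in safety factors

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                        # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights in one linear solve

pred = np.tanh(X @ W + b) @ beta
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```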

Keywords: landslide, limit analysis, artificial neural network, soil properties

Procedia PDF Downloads 207
44742 Sentiment Analysis: Comparative Analysis of Multilingual Sentiment and Opinion Classification Techniques

Authors: Sannikumar Patel, Brian Nolan, Markus Hofmann, Philip Owende, Kunjan Patel

Abstract:

Sentiment analysis and opinion mining have become emerging topics of research in recent years, but most of the work has focused on data in the English language. Comprehensive research and analysis that consider multiple languages, machine translation techniques, and different classifiers are essential. This paper presents a comparative analysis of different approaches for multilingual sentiment analysis. These approaches are divided into two parts: the first classifying text without language translation, and the second translating the test data into a target language, such as English, before classification. The presented research and results are useful for understanding whether machine translation should be used for multilingual sentiment analysis or whether building language-specific sentiment classification systems is a better approach. The effects of language translation techniques and features, and the accuracy of various classifiers for multilingual sentiment analysis, are also discussed in this study.

Keywords: cross-language analysis, machine learning, machine translation, sentiment analysis

Procedia PDF Downloads 713
44741 Merging Sequence Diagrams Based Slicing

Authors: Bouras Zine Eddine, Talai Abdelouaheb

Abstract:

The need to merge software artifacts seems inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means to deal with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. Moreover, the earlier changes are introduced into the life cycle, the easier they are for designers to manage. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams by using the concept of dependence analysis, which formally captures all mappings and differences between elements of sequence diagrams and serves as a key concept for creating a new version of a sequence diagram.

Keywords: system behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing

Procedia PDF Downloads 340
44740 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with the analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, the three polysemantic synonyms true, loyal, and faithful have been analyzed in terms of the differences and similarities in their semantic structure. A corpus-based approach to the study of the above-mentioned adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study, there were no special needs regarding the genre, mode, or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g., word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas (e.g., true, true to) and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like 'An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders' or ''True,' said Phoebe, 'but I'd probably get to be a Union Official immediately'' were left out, as in the first example the faithful is a substantivized adjective, and in the second example true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable, and convenient tool for obtaining data for further semantic study.
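
The collocation step described above can be illustrated with a toy sketch: extracting the co-occurrences of a search word within a fixed window. The tokenized sentences are placeholders, not BNC or COCA material; real queries would go through the corpora's own interfaces.

```python
from collections import Counter

sentences = [
    "he remained true to his word".split(),
    "a loyal friend and a faithful servant".split(),
    "the story is true in every detail".split(),
]

def collocates(target, sentences, window=2):
    counts = Counter()
    for tokens in sentences:
        for i, tok in enumerate(tokens):
            if tok == target:
                counts.update(tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window])
    return counts

print(collocates("true", sentences).most_common(5))
```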

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 314
44739 Developing a Sustainable System to Deliver Early Intervention for Emotional Health through Australian Schools

Authors: Rebecca-Lee Kuhnert, Ron Rapee

Abstract:

Up to 15% of Australian youth will experience an emotional disorder, yet relatively few get the help they need. Schools provide an ideal environment through which we can identify young people who are struggling and provide them with appropriate help. Universal mental health screening is a method by which all young people in school can be quickly assessed for emotional disorders, after which identified youth can be linked to appropriate health services. Despite the obvious logic of this process, universal mental health screening has received little scientific evaluation and even less application in Australian schools. This study will develop methods for Australian education systems to help identify young people (aged 9-17 years) who are struggling with existing and emerging emotional disorders. Prior to testing, a series of focus groups will be run to get feedback and input from young people, parents, teachers, and mental health professionals. They will be asked about their thoughts on school-based screening methods and how best to help students at risk of emotional distress. Schools (n=91) across New South Wales, Australia will be randomised to do either immediate screening (in May 2021) or delayed screening (in February 2022). Students in immediate screening schools will complete a long online mental health screener consisting of standard emotional health questionnaires. Ultimately, this large set of items will be reduced to a small number of items to form the final brief screener. Students who score in the "at-risk" range on any measure of emotional health problems will be identified to schools and offered pathways to relevant help according to the most accepted and approved processes identified by the focus groups. Nine months later, the same process will occur among delayed screening schools. At this same time, students in the immediate screening schools will complete screening for a second time. This will allow a direct comparison of the emotional health and help-seeking between youth whose schools had engaged in the screening and pathways to care process (immediate) and those whose schools had not engaged in the process (delayed). It is hypothesised that there will be a significant increase in students who receive help from mental health support services after screening, compared with baseline. It is also predicted that all students will show significantly less emotional distress after screening and access to pathways of care. This study will be an important contribution to Australian youth mental health prevention and early intervention by determining whether school screening leads to a greater number of young people with emotional disorders getting the help that they need and improving their mental health outcomes.

Keywords: children and young people, early intervention, mental health, mental health screening, prevention, school-based mental health

Procedia PDF Downloads 96
44738 The Analysis of Drill Bit Optimization by the Application of New Electric Impulse Technology in Shallow Water Absheron Peninsula

Authors: Ayshan Gurbanova

Abstract:

Although the drill bit, the smallest part of the bottom hole assembly, accounts for only 10% to 15% of total expenses, it is the first piece of equipment to be in contact with the formation itself. Hence, it is essential to choose the appropriate type and dimension of drill bit, which prevents the majority of problems by not demanding many tripping procedures. With the advance of technology, it is now possible to gain in terms of operating time, energy, expenditure, power, and so forth. With the intention of applying the method in Azerbaijan, the Shallow Water Absheron Peninsula field has been suggested, where the mainland is located 15 km away from the wildcat well named "NKX01". The water depth is 22 m. In 2015 and 2016, 2D and 3D seismic surveys were conducted in the contract area as well as at onshore shallow-water locations. To provide a clear picture, soil stability, possible subsea hazard, geohazard, and bathymetry surveys were carried out as well. From the seismic analysis results, the exact locations of the exploration wells were determined, and measurement decisions were made to divide the area into three productive zones. As for the method, Electric Impulse Technology (EIT) is based on the discharge of electrical energy into the rock, breaking it down. Put simply, very high voltages are generated within nanoseconds and sent into the rock through electrodes. These electrodes, one at high voltage and one grounded, are placed on the formation, which may be submerged in liquid. With this design, it is easier to drill a horizontal well, taking advantage of the loose contact with the formation. There is also no wear, as no combustion or mechanical power is involved. In terms of energy, conventional drilling requires about 1000 J/cm3, whereas EIT requires between 100 and 200 J/cm3. Finally, the test analysis showed that a rate of penetration (ROP) of more than 2 m/hr was achieved over 15 days. Taking everything into consideration, the comparison of the data shows that this method is highly applicable to the fields of Azerbaijan.

Keywords: drilling, drill bit cost, efficiency, cost

Procedia PDF Downloads 74
44737 Artificial Intelligence Based Comparative Analysis for Supplier Selection in Multi-Echelon Automotive Supply Chains via GEP and ANN Models

Authors: Seyed Esmail Seyedi Bariran, Laysheng Ewe, Amy Ling

Abstract:

Since supplier selection is a vital decision, selecting suppliers in the best and most accurate way is of great importance for enterprises. In this study, a new artificial intelligence approach is applied to remove the weaknesses of supplier selection. The paper has three parts. The first part is choosing the appropriate criteria for assessing the suppliers' performance. The next is collecting the data set based on expert input. Afterwards, the data set is divided into two parts, the training data set and the testing data set. The training data set is used to select the best structures of the GEP and ANN models, and the testing data set is used to evaluate the power of these methods. The results obtained show that the accuracy of GEP is higher than that of ANN. Moreover, unlike ANN, GEP provides an explicit mathematical equation for supplier selection.

Keywords: supplier selection, automotive supply chains, ANN, GEP

Procedia PDF Downloads 631
44736 A Comprehensive Study on the Porosity Effect of Ti-20Zr Alloy Produced by Powder Metallurgy as a Biomaterial

Authors: Eyyup Murat Karakurt, Yan Huang, Mehmet Kaya, Huseyin Demirtas

Abstract:

In this study, the effect of porosity on a Ti-20Zr alloy produced by powder metallurgy as a biomaterial was investigated experimentally. The Ti-based alloys (Ti-20 at.% Zr) were produced under 300 MPa, for 6 h at 1200 °C. Afterwards, the microstructure of the Ti-based alloys was analyzed by optical microscopy, scanning electron microscopy, and energy-dispersive spectrometry. Moreover, compression tests were applied to determine the mechanical behaviour of the samples. As a result, the highly porous Ti-20Zr alloys exhibited an elastic modulus close to that of human bone. The results were then compared theoretically and experimentally.

Keywords: porosity effect, Ti based alloys, elastic modulus, compression test

Procedia PDF Downloads 230
44735 Development of Nondestructive Imaging Analysis Method Using Muonic X-Ray with a Double-Sided Silicon Strip Detector

Authors: I-Huan Chiu, Kazuhiko Ninomiya, Shin’ichiro Takeda, Meito Kajino, Miho Katsuragawa, Shunsaku Nagasawa, Atsushi Shinohara, Tadayuki Takahashi, Ryota Tomaru, Shin Watanabe, Goro Yabu

Abstract:

In recent years, a nondestructive elemental analysis method based on muonic X-ray measurements has been developed and applied to various samples. Muonic X-rays are emitted after the formation of a muonic atom, which occurs when a negatively charged muon is captured in a muon atomic orbit around the nucleus. Because muonic X-rays have higher energy than electronic X-rays due to the muon mass, they can be measured without being absorbed by the material. Thus, estimating the two-dimensional (2D) elemental distribution of a sample became possible using an X-ray imaging detector. In this work, we report a non-destructive imaging experiment using muonic X-rays at the Japan Proton Accelerator Research Complex. The irradiated target consisted of polypropylene material, and a double-sided silicon strip detector, which was developed as an imaging detector for astronomical observation, was employed. A peak corresponding to muonic X-rays from the carbon atoms in the target was clearly observed in the energy spectrum at an energy of 14 keV, and 2D visualizations were successfully reconstructed to reveal the projection image of the target. This result demonstrates the potential of the non-destructive elemental imaging method based on muonic X-ray measurement. To obtain a higher position resolution for imaging a smaller target, a new detector system will be developed to improve the statistical analysis in further research.

Keywords: DSSD, muon, muonic X-ray, imaging, non-destructive analysis

Procedia PDF Downloads 205
44734 The Impact of Artificial Intelligence on Construction Projects

Authors: Muller Salah Zaky Toudry

Abstract:

The complexity arises in defining construction quality due to its notion being based on inherent market conditions and their requirements, the diverse stakeholders themselves, and their desired outputs. A quantitative survey-based approach was adopted in this study. A questionnaire-based survey was conducted to assess construction quality perception and expectations in the context of quality improvement techniques. The survey feedback from professionals of the leading construction firms/companies of the Pakistan construction industry was analyzed. The financial capability, organizational structure, and construction experience of the construction companies formed the basis for their selection. The quality perception was found to be project-scope-oriented and considered an extra cost for a construction project. Any quality improvement technique was expected to maximize the profit for the company by improving productivity in a construction project. The study is beneficial for construction professionals in assessing the prevailing construction quality perception and the expectations from the implementation of any quality improvement technique in construction projects.

Keywords: correlation analysis, lean construction tools, lean construction, logistic regression analysis, risk management, safety, construction quality, expectation, improvement, perception, client loyalty, NPS, pre-construction, schedule reduction

Procedia PDF Downloads 16