Search results for: paper analysis techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 47886

47256 Analysis of Ionospheric Variations over Japan during 23rd Solar Cycle Using Wavelet Techniques

Authors: C. S. Seema, P. R. Prince

Abstract:

The characterization of spatio-temporal inhomogeneities in the ionospheric F₂ layer is important because these variations are direct consequences of the electrodynamical coupling between the magnetosphere and solar events. The temporal and spatial variations of the F₂ layer, which occur with periods of several days or even years, are mainly due to geomagnetic and meteorological activity. The hourly F₂ layer critical frequency (foF2) over the 23rd solar cycle (1996-2008) at three northern-hemisphere ionosonde stations (Wakkanai, Kokubunji, and Okinawa), which fall within the same longitudinal span, is analyzed using continuous wavelet techniques. The Morlet wavelet is used to transform the continuous foF2 time series into a two-dimensional time-frequency space, quantifying the time evolution of the oscillatory modes. Significant periodicities, and the time location of each, are detected from the two-dimensional representation of the wavelet power in the time-scale plane. The mean strength of each periodicity over the entire period of analysis is studied using the global wavelet spectrum. Quasi-biennial, annual, semiannual, 27-day, diurnal, and 12-hour variations of foF2 are clearly evident in the wavelet power spectra at all three stations. Critical frequency oscillations with multi-day periods (2-3 days and 9 days at the low-latitude station, 6-7 days at all stations, and 15 days at the mid-high-latitude station) are also superimposed on the larger time-scale variations.
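
A minimal sketch of the continuous-wavelet workflow described above, using PyWavelets with the Morlet wavelet; the synthetic hourly series, the scale range, and the library choice are illustrative assumptions rather than the authors' data or tooling.

```python
import numpy as np
import pywt

# Synthetic hourly "foF2" series: diurnal (24 h) + 27-day modulation + noise
hours = np.arange(0, 24 * 365)          # one year of hourly samples
signal = (np.sin(2 * np.pi * hours / 24)
          + 0.5 * np.sin(2 * np.pi * hours / (27 * 24))
          + 0.2 * np.random.randn(hours.size))

scales = np.arange(1, 512)              # assumed range of wavelet scales
coeffs, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1.0)

power = np.abs(coeffs) ** 2             # wavelet power in the time-scale plane
global_spectrum = power.mean(axis=1)    # time-averaged (global) wavelet spectrum

periods = 1.0 / freqs                   # periods (hours) for each scale
dominant = periods[np.argmax(global_spectrum)]
print(f"Dominant period = {dominant:.1f} hours")   # diurnal peak expected
```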

Keywords: continuous wavelet analysis, critical frequency, ionosphere, solar cycle

Procedia PDF Downloads 219
47255 Digital Art Fabric Prints: Procedure, Process and Progress

Authors: Tripti Singh

Abstract:

Digital tools are merging the boundaries of different mediums as artists endeavour to explore new areas. Digital fabric printing has motivated artists to create prints by combining images acquired by photography, scanned images, computer graphics, and microscopic imagery, to name a few, with traditional media such as hand drawing, weaving, hand-printed patterns, printmaking techniques, and so on. It has opened a whole new world of possibilities for artists to search, research, and combine old and contemporary mediums for their unique art prints. As an artistic medium, digital art fabrics have aesthetic values that influence not only a personality but also the interior of a living or work space. They can be worn as a fashion statement and used as interior decoration. Digital art fabric prints give artists the opportunity to print almost anything on any fabric with long-lasting print quality. Single and limited editions are possible, maintaining the scarcity and uniqueness of an art form. These fabric prints fulfill today's needs, as they are eco-friendly in nature and produce less waste than traditional fabric printing techniques. They can be used to make unique and customized curtains, quilts, clothes, bags, furniture, dolls, pillows, framed artwork, costumes, banners, and much more. This paper explores the procedure, process, and progress of digital art fabric printing in depth, with suitable pictorial examples.

Keywords: digital art, fabric prints, digital fabric prints, new media

Procedia PDF Downloads 512
47254 Defining a Pathway to Zero Energy Building: A Case Study on Retrofitting an Old Office Building into a Net Zero Energy Building for Hot-Humid Climate

Authors: Kwame B. O. Amoah

Abstract:

This paper focuses on retrofitting an existing old office building into a net-zero energy building (NZEB). A small office building in Melbourne, Florida, was chosen as a case study for integrating state-of-the-art design strategies and energy-efficient building systems to improve building performance and reduce energy consumption. The study explores ways to maximize energy savings and to use renewable energy generation to cover the building's remaining energy needs, as required to achieve the net-zero energy goal. A series of retrofit options were reviewed and adopted, with some significant additional decision considerations. The processes and considerations leading to zero energy are documented in detail, and lessons learned are outlined. Based on building energy simulations, multiple design considerations were investigated, such as emerging state-of-the-art technologies, material selection, improvements to the building envelope, optimization of the HVAC and lighting systems, occupancy load analysis, and the application of renewable energy sources. A comparative analysis of the simulation results was used to determine how specific techniques led to energy and cost savings. The results indicate that this small office building can reach net-zero energy use after appropriate design modifications and the addition of renewable energy sources.

Keywords: energy consumption, building energy analysis, energy retrofits, energy-efficiency

Procedia PDF Downloads 220
47253 Diversity in Finance Literature Revealed through the Lens of Machine Learning: A Topic Modeling Approach on Academic Papers

Authors: Oumaima Lahmar

Abstract:

This paper aims to define a structured topography for finance researchers seeking to navigate the body of knowledge in their exploration of finance phenomena. To make sense of the body of knowledge in finance, a probabilistic topic modeling approach is applied to 6,000 abstracts of academic articles published in three top finance journals between 1976 and 2020. This approach combines machine learning techniques and natural language processing to statistically identify the connections between research articles and their shared topics, each described by relevant keywords. The topic modeling analysis reveals 35 coherent topics that depict the finance literature well and provide a comprehensive structure for its ongoing research themes. Comparing the extracted topics to the Journal of Economic Literature (JEL) classification system highlights a significant similarity between the characterizing keywords. On the other hand, we identify topics that do not match the JEL classification despite being relevant in the finance literature.
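
A minimal sketch of the probabilistic topic-modeling step, using gensim's LDA; the toy abstracts below stand in for the 6,000 journal abstracts, while num_topics=35 follows the figure reported above (it is far too many topics for a corpus this small, but shows the workflow).

```python
from gensim import corpora, models

abstracts = [
    "asset pricing model risk premium returns",
    "corporate governance board firm value",
    "market liquidity trading volume spreads",
]
texts = [doc.split() for doc in abstracts]   # tokenization kept trivial here

dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(text) for text in texts]

lda = models.LdaModel(corpus, num_topics=35, id2word=dictionary,
                      passes=10, random_state=0)

# Keywords characterizing each topic, plus the perplexity the keywords mention
for topic_id, words in lda.print_topics(num_topics=3, num_words=5):
    print(topic_id, words)
print("log perplexity:", lda.log_perplexity(corpus))
```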

Keywords: finance literature, textual analysis, topic modeling, perplexity

Procedia PDF Downloads 169
47252 Sentiment Analysis of Consumers’ Perceptions on Social Media about the Main Mobile Providers in Jamaica

Authors: Sherrene Bogle, Verlia Bogle, Tyrone Anderson

Abstract:

In recent years, organizations have become increasingly interested in analyzing social media as a means of gaining meaningful feedback about their products and services. An aspect-based sentiment analysis approach is used to predict the sentiment of Twitter datasets for Digicel and Lime, the main mobile companies in Jamaica, using supervised learning classification techniques. The results indicate an average accuracy of 82.2 percent in classifying tweets across three separate classification algorithms, against the purported baseline of 70 percent, and an average root mean squared error of 0.31. These results indicate that analyzing sentiment on social media to gain customer feedback can be a viable solution for mobile companies looking to improve business performance.
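
As a hedged illustration of the supervised classification step, the sketch below trains a linear SVM on TF-IDF features of a few toy tweets; the labeled examples and the particular classifier are assumptions, since the abstract does not name the three algorithms compared.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

tweets = [
    "love the new data plan, great speed",
    "network has been down all day, terrible service",
    "billing issue resolved quickly, thanks",
    "calls keep dropping, very frustrated",
]
labels = [1, 0, 1, 0]   # 1 = positive, 0 = negative

# TF-IDF features feeding a linear SVM, evaluated by cross-validation
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
scores = cross_val_score(model, tweets, labels, cv=2)
print("mean accuracy:", scores.mean())
```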

Keywords: machine learning, sentiment analysis, social media, supervised learning

Procedia PDF Downloads 440
47251 Automatic Number Plate Recognition System Based on Deep Learning

Authors: T. Damak, O. Kriaa, A. Baccar, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the last few years, Automatic Number Plate Recognition (ANPR) systems have become widely used in safety, security, and commercial applications. Accordingly, several methods and techniques have been developed in pursuit of better accuracy and real-time execution. This paper proposes a computer vision algorithm for Number Plate Localization (NPL) and Character Segmentation (CS). In addition, it proposes an improved method for Optical Character Recognition (OCR) based on Deep Learning (DL) techniques. To recognize the digits of the plate detected after the NPL and CS steps, a Convolutional Neural Network (CNN) is proposed. The DL model is built from four convolutional layers, two max-pooling layers, and six fully connected layers. The model was trained on a digit image database on the Jetson TX2 NVIDIA target and achieved an accuracy of 95.84%.
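
A Keras sketch of the architecture described above, following the stated layer counts (four convolutional, two max-pooling, six fully connected); the filter counts, kernel sizes, 32×32 input, and ten-class digit output are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 1)),                 # assumed grayscale input
    layers.Conv2D(32, 3, activation='relu', padding='same'),
    layers.Conv2D(32, 3, activation='relu', padding='same'),
    layers.MaxPooling2D(),                           # first of two max-pooling layers
    layers.Conv2D(64, 3, activation='relu', padding='same'),
    layers.Conv2D(64, 3, activation='relu', padding='same'),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(512, activation='relu'),            # six fully connected layers
    layers.Dense(256, activation='relu'),
    layers.Dense(128, activation='relu'),
    layers.Dense(64, activation='relu'),
    layers.Dense(32, activation='relu'),
    layers.Dense(10, activation='softmax'),          # 10 digit classes
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```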

Keywords: ANPR, CS, CNN, deep learning, NPL

Procedia PDF Downloads 304
47250 Scientific Theoretical Fundamentals of Comparative Analysis

Authors: Khalliyeva Gulnoz Iskandarovna, Mannonova Feruzabonu Sherali Qizi

Abstract:

Comparative literature, or literary comparative studies, is a scientific field that compares two or more literary phenomena. Today, when global social, cultural, and literary relations are growing daily, comparative literature is one of the most important scientific fields. Any comparative investigation reveals both the shared and the unique characteristics of literary phenomena, and these provide the cornerstone for overarching theoretical principles that apply to all literature. Comparative analysis deals with objects of comparison and their constituent components, and, beyond the actions mentioned above, it also compares the components of the objects of analysis with each other. The purpose of this article is to investigate comparative analysis in literature and to identify similarities and differences between comparable objects. In studying this topic, students, teachers, and researchers should become able to describe comparative research techniques and their fundamental ideas, and should gain a basic understanding of comparative literature and its main concepts.

Keywords: object, natural, social, spiritual, epistemological, logical, methodological, axiological tasks, stages of comparison, environment, internal features, and typical situations

Procedia PDF Downloads 57
47249 Synthesis and Characterization of Hydroxyapatite from Biowaste for Potential Medical Application

Authors: M. D. H. Beg, John O. Akindoyo, Suriati Ghazali, Nitthiyah Jeyaratnam

Abstract:

Over time, several approaches have been undertaken to mitigate the challenges associated with bone regeneration. These include, but are not limited to, xenografts, allografts, and autografts, as well as artificial substitutes such as bioceramics, synthetic cements, and metals. The first three techniques often come with particular limitations and problems, such as morbidity, limited availability, disease transmission, collateral site damage, or outright rejection by the body. Synthetic routes therefore remain the only feasible alternative for the treatment of bone defects, and hydroxyapatite (HA) is highly biocompatible and well suited to this application. However, most of the common methods for HA synthesis are expensive, complicated, or environmentally unfriendly. Interestingly, extraction of HA from biowaste is perceived to be not only cost-effective but also environmentally friendly. In this research, HA was synthesized from a biowaste, namely bovine bone, through three different methods: hydrothermal chemical processing, ultrasound-assisted synthesis, and ordinary calcination. Structure and property analysis of the HA was carried out using different characterization techniques such as TGA, FTIR, and XRD. All the methods applied were able to produce HA with compositional properties similar to the biomaterials found in human calcified tissues. Calcination was, however, observed to be more efficient, as it eliminated all the organic components from the produced HA. The HA synthesized is notable for its minimal cost and environmental friendliness and is considered suitable for tissue and bone engineering applications.

Keywords: hydroxyapatite, bone, calcination, biowaste

Procedia PDF Downloads 247
47248 Construction of Large Scale UAVs Using Homebuilt Composite Techniques

Authors: Brian J. Kozak, Joshua D. Shipman, Peng Hao Wang, Blake Shipp

Abstract:

The unmanned aerial system (UAS) industry is growing at a rapid pace. This growth has increased the demand for low-cost, custom-made, high-strength unmanned aerial vehicles (UAVs). Most of this growth is in vehicles in the 25 kg to 200 kg class. Vehicles of this size are beyond the scope of the simple wood-and-fabric designs commonly found in hobbyist aircraft, and these high-end vehicles require stronger materials to complete their missions. Traditional aircraft construction materials such as aluminum are difficult to use without machining or advanced computer-controlled tooling. However, by using the composite materials and homebuilding techniques of general aviation kit aircraft, a large-scale UAV can be constructed cheaply and easily. Furthermore, these techniques can easily produce custom composite shapes and airfoils that would be cost-prohibitive in metal. The researchers are demonstrating these homebuilt aircraft techniques in the construction of a 75 kg aircraft.

Keywords: composite aircraft, homebuilding, unmanned aerial system industry, UAS, unmanned aerial vehicles, UAV

Procedia PDF Downloads 135
47247 Aerodynamic Modelling of Unmanned Aerial System through Computational Fluid Dynamics: Application to the UAS-S45 Balaam

Authors: Maxime A. J. Kuitche, Ruxandra M. Botez, Arthur Guillemin

Abstract:

As Unmanned Aerial Systems have found diverse uses in both military and civil aviation, interest in obtaining accurate aerodynamic models has grown enormously. Recent modeling techniques rely on optimization algorithms and statistics that require many flight tests and are therefore extremely costly. This paper presents a procedure for estimating the aerodynamic behavior of an unmanned aerial system numerically, using computational fluid dynamics analysis. The study was performed on an unstructured mesh obtained from a grid convergence analysis at a Mach number of 0.14 and an angle of attack of 0°. The flow around the aircraft was described using a standard k-ω turbulence model, and the Reynolds-Averaged Navier-Stokes (RANS) equations were solved using the ANSYS FLUENT software. The method was applied to the UAS-S45, designed and manufactured by Hydra Technologies in Mexico. The lift, drag, and pitching moment coefficients were obtained at different angles of attack for several flight conditions defined in terms of altitude and Mach number. The results of the Computational Fluid Dynamics analysis were compared with those obtained using the DATCOM semi-empirical procedure. This comparison indicates that the approach is highly accurate and that the resulting aerodynamic model could be used to estimate the flight dynamics of the UAS-S45.
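
As a small worked example of the post-processing implied above, the sketch below converts solver force and moment outputs into lift, drag, and pitching-moment coefficients; the freestream conditions, reference area, chord, and force values are assumptions, not UAS-S45 data.

```python
rho = 1.225        # air density at sea level, kg/m^3
V = 47.6           # freestream speed (~Mach 0.14 at sea level), m/s
S = 1.5            # reference wing area, m^2 (assumed)
c = 0.45           # mean aerodynamic chord, m (assumed)
q = 0.5 * rho * V**2   # dynamic pressure, Pa

# Example force/moment integration results from the solver (assumed values)
lift, drag, pitch_moment = 850.0, 62.0, -38.0   # N, N, N*m

CL = lift / (q * S)                  # lift coefficient
CD = drag / (q * S)                  # drag coefficient
Cm = pitch_moment / (q * S * c)      # pitching-moment coefficient
print(f"CL = {CL:.3f}, CD = {CD:.3f}, Cm = {Cm:.3f}")
```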

Keywords: aerodynamic modelling, CFD Analysis, ANSYS FLUENT, UAS-S45

Procedia PDF Downloads 373
47246 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancement over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic estimates.
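
A minimal sketch of the report-text-to-prediction pipeline in the spirit of the study above, pairing TF-IDF text features with a Random Forest; the toy radiology-report snippets and labels are illustrative assumptions, not the Indiana University dataset.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

reports = [
    "lungs are clear, no acute cardiopulmonary abnormality",
    "right lower lobe opacity concerning for pneumonia",
    "no focal consolidation, pleural effusion, or pneumothorax",
    "cardiomegaly with pulmonary vascular congestion",
]
labels = [0, 1, 0, 1]   # 0 = normal, 1 = abnormal finding

# Text features from the reports feeding a Random Forest classifier
model = make_pipeline(TfidfVectorizer(stop_words='english'),
                      RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(reports, labels)
print(model.predict(["left lower lobe consolidation suspicious for pneumonia"]))
```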

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 42
47245 Assessing the Impact of Low Carbon Technology Integration on Electricity Distribution Networks: Advancing towards Local Area Energy Planning

Authors: Javier Sandoval Bustamante, Pardis Sheikhzadeh, Vijayanarasimha Hindupur Pakka

Abstract:

In the pursuit of achieving net-zero carbon emissions, the integration of low carbon technologies into electricity distribution networks is paramount. This paper delves into the critical assessment of how the integration of low carbon technologies, such as heat pumps, electric vehicle chargers, and photovoltaic systems, impacts the infrastructure and operation of electricity distribution networks. The study employs rigorous methodologies, including power flow analysis and headroom analysis, to evaluate the feasibility and implications of integrating these technologies into existing distribution systems. Furthermore, the research utilizes Local Area Energy Planning (LAEP) methodologies to guide local authorities and distribution network operators in formulating effective plans to meet regional and national decarbonization objectives. Geospatial analysis techniques, coupled with building physics and electric energy systems modeling, are employed to develop geographic datasets aimed at informing the deployment of low carbon technologies at the local level. Drawing upon insights from the Local Energy Net Zero Accelerator (LENZA) project, a comprehensive case study illustrates the practical application of these methodologies in assessing the rollout potential of LCTs. The findings not only shed light on the technical feasibility of integrating low carbon technologies but also provide valuable insights into the broader transition towards a sustainable and electrified energy future. This paper contributes to the advancement of knowledge in power electrical engineering by providing empirical evidence and methodologies to support the integration of low carbon technologies into electricity distribution networks. The insights gained are instrumental for policymakers, utility companies, and stakeholders involved in navigating the complex challenges of energy transition and achieving long-term sustainability goals.
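
A minimal sketch of the headroom-analysis idea mentioned above: compare a distribution transformer's capacity against baseline demand plus an assumed uptake of EV chargers and heat pumps. All ratings, profiles, and diversity factors are illustrative assumptions rather than LENZA project data.

```python
import numpy as np

transformer_rating_kva = 500.0
power_factor = 0.95
capacity_kw = transformer_rating_kva * power_factor

# Baseline half-hourly peak-day demand profile for the feeder (assumed), kW
baseline = 250 + 120 * np.sin(np.linspace(0, 2 * np.pi, 48))

# Added LCT load: 40 EV chargers (7 kW, evening slots) and 60 heat pumps (3 kW)
ev_profile = np.zeros(48)
ev_profile[34:44] = 40 * 7.0 * 0.6        # 60% diversity in the evening
hp_profile = np.full(48, 60 * 3.0 * 0.5)  # 50% diversity all day

total = baseline + ev_profile + hp_profile
headroom = capacity_kw - total            # remaining capacity per half hour
print(f"Minimum headroom: {headroom.min():.0f} kW "
      f"({'OK' if headroom.min() > 0 else 'reinforcement needed'})")
```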

Keywords: energy planning, energy systems, digital twins, power flow analysis, headroom analysis

Procedia PDF Downloads 55
47244 Process for Separating and Recovering Materials from Kerf Slurry Waste

Authors: Tarik Ouslimane, Abdenour Lami, Salaheddine Aoudj, Mouna Hecini, Ouahiba Bouchelaghem, Nadjib Drouiche

Abstract:

Slurry waste is a byproduct generated from the slicing of multi-crystalline silicon ingots. This waste can be used as a secondary resource to recover high purity silicon, which has great economic value. From a management perspective, the ever-increasing generation of kerf slurry waste poses significant challenges for the photovoltaic industry, because slurry waste is currently little used for silicon recovery. Slurry waste, in most cases, contains silicon, silicon carbide, metal fragments, and a mineral-oil-based or glycol-based slurry vehicle. As a result of the global scarcity of high purity silicon, the high purity silicon content in slurry has increasingly attracted research interest. This paper presents a critical overview of the techniques currently employed for recovering high purity silicon from kerf slurry waste. Hydrometallurgy remains a continuing subject of study and research; in addition, this review introduces several new techniques for recovering high purity silicon from slurry waste. The information presented is intended to support the development of a clean and effective process for recovering high purity silicon from slurry waste.

Keywords: kerf loss, slurry waste, silicon carbide, silicon recovery, photovoltaic, high purity silicon, polyethylene glycol

Procedia PDF Downloads 308
47243 Modal Analysis of a Cantilever Beam Using an Inexpensive Smartphone Camera: Motion Magnification Technique

Authors: Hasan Hassoun, Jaafar Hallal, Denis Duhamel, Mohammad Hammoud, Ali Hage Diab

Abstract:

This paper aims to prove the accuracy of an inexpensive smartphone camera as a non-contact vibration sensor for recovering the vibration modes of a vibrating structure such as a cantilever beam. A video of a vibrating beam is filmed using a smartphone camera and then processed with the motion magnification technique. Based on this method, the first two natural frequencies and their associated mode shapes are estimated experimentally and compared to the analytical ones. Results show a relative error of less than 4% between the experimental and analytical approaches for the first two natural frequencies of the beam. Also, for the first two mode shapes, a Modal Assurance Criterion (MAC) value above 0.9 between the two approaches is obtained. This small error between the different techniques confirms the viability of a cheap smartphone camera as a non-contact vibration sensor, particularly for structures vibrating at relatively low natural frequencies.
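
A minimal sketch of the Modal Assurance Criterion used above to compare experimental and analytical mode shapes; the two example mode-shape vectors are illustrative assumptions, not the beam's actual modes.

```python
import numpy as np

def mac(phi_a, phi_b):
    """MAC between two real mode-shape vectors: 1 = perfectly correlated."""
    num = np.abs(phi_a @ phi_b) ** 2
    den = (phi_a @ phi_a) * (phi_b @ phi_b)
    return num / den

x = np.linspace(0, 1, 20)
phi_exp = np.sin(np.pi * x / 2) + 0.03 * np.random.randn(x.size)  # "measured"
phi_ana = np.sin(np.pi * x / 2)                                   # "analytical"

print(f"MAC = {mac(phi_exp, phi_ana):.3f}")   # should be close to 1
```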

Keywords: modal analysis, motion magnification, smartphone camera, structural vibration, vibration modes

Procedia PDF Downloads 146
47242 Automatic Calibration of Agent-Based Models Using Deep Neural Networks

Authors: Sima Najafzadehkhoei, George Vega Yon

Abstract:

This paper presents an approach for calibrating Agent-Based Models (ABMs) efficiently, utilizing Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, a core framework in the study of epidemiology. Our method recovers parameter values from observed trajectory curves, enhancing the accuracy of predictions compared to traditional calibration techniques. Using simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one in which the numbers of susceptible, infected, and recovered individuals are fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
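
A minimal sketch of the calibration idea above: simulate SIR trajectories, then train an LSTM to regress the transmission and recovery parameters from the infected curve alone (the second scenario described); network size, parameter grid, and population are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def simulate_sir(beta, gamma, n=1000, i0=10, steps=60):
    """Deterministic discrete-time SIR; returns the fraction infected."""
    s, i, r = n - i0, i0, 0
    traj = []
    for _ in range(steps):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        traj.append(i / n)
    return np.array(traj)

rng = np.random.default_rng(0)
betas = rng.uniform(0.2, 0.6, 2000)
gammas = rng.uniform(0.05, 0.2, 2000)
X = np.stack([simulate_sir(b, g) for b, g in zip(betas, gammas)])[..., None]
y = np.stack([betas, gammas], axis=1)

model = models.Sequential([
    layers.Input(shape=(60, 1)),
    layers.LSTM(32),
    layers.Dense(2),                       # -> (beta, gamma) estimates
])
model.compile(optimizer='adam', loss='mse')
model.fit(X, y, epochs=10, batch_size=64, verbose=0)
print(model.predict(simulate_sir(0.4, 0.1)[None, :, None]))  # expect ~[0.4, 0.1]
```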

Keywords: ABM, calibration, CNN, LSTM, epidemiology

Procedia PDF Downloads 23
47241 The Study of Dengue Fever Outbreak in Thailand Using Geospatial Techniques, Satellite Remote Sensing Data and Big Data

Authors: Tanapat Chongkamunkong

Abstract:

The objective of this paper is to present a practical use of Geographic Information Systems (GIS) for public health, based on the spatial correlation between multiple factors and dengue fever outbreaks. Meteorological, demographic, and environmental factors are compiled using GIS techniques along with Global Satellite Mapping remote sensing (RS) data. We use monthly dengue fever cases, population density, precipitation, and Digital Elevation Model (DEM) data. The study covers climate variability associated with the El Niño–Southern Oscillation (ENSO), indicated by sea surface temperature (SST), over a study area of 12 provinces of Thailand, using RS data from January 2007 to December 2014.

Keywords: dengue fever, sea surface temperature, Geographic Information System (GIS), remote sensing

Procedia PDF Downloads 197
47240 Mining User-Generated Contents to Detect Service Failures with Topic Model

Authors: Kyung Bae Park, Sung Ho Ha

Abstract:

Online user-generated content (UGC) significantly changes the way customers behave (e.g., shop, travel), and handling the overwhelming amount of diverse UGC is one of the paramount issues for management. However, current approaches (e.g., sentiment analysis) are often ineffective at leveraging textual information to detect the problems or issues from which a given organization suffers. In this paper, we employ text mining with Latent Dirichlet Allocation (LDA) on a popular online review site dedicated to user complaints. We find that LDA efficiently detects customer complaints and that further inspection with a visualization technique is effective for categorizing the problems or issues. As such, management can identify the issues at stake and prioritize them in a timely manner, given limited resources. The findings provide managerial insights into how analytics on social media can help organizations maintain and improve their reputation. Our interdisciplinary approach also yields several insights from applying machine learning techniques in the marketing research domain. On a broader technical note, this paper illustrates the details of how to implement LDA in the R language from beginning (data collection) to end (LDA analysis), since such instruction is still largely undocumented. In this regard, it will help lower the barrier for interdisciplinary researchers to conduct related research.

Keywords: latent dirichlet allocation, R program, text mining, topic model, user generated contents, visualization

Procedia PDF Downloads 186
47239 Design, Development by Functional Analysis in UML and Static Test of a Multimedia Voice and Video Communication Platform on IP for a Use Adapted to the Context of Local Businesses in Lubumbashi

Authors: Blaise Fyama, Elie Museng, Grace Mukoma

Abstract:

In this article we present a Java implementation of video telephony using the Session Initiation Protocol (SIP). After a functional analysis of the SIP protocol, we relied on the work of researchers at the University of Parma, Italy, to acquire adequate libraries for the development of our own communication tool. In order to optimize the code and improve the prototype, we used, in an incremental approach, test techniques based on static analysis, evaluating software complexity through metrics such as McCabe's cyclomatic number. The objective is to promote the emergence of local start-ups producing IP video in a well-understood local context. The result is a video telephony tool with optimized code.
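
As a hedged illustration of the metric named above, the sketch below computes McCabe's cyclomatic number M = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components; the example graph is an assumption, and Python is used here purely for illustration even though the platform itself is written in Java.

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's number for a control-flow graph: M = E - N + 2P."""
    return edges - nodes + 2 * components

# Control-flow graph of a function with one if/else and one loop (assumed):
# 7 nodes, 8 edges, a single connected component.
print(cyclomatic_complexity(edges=8, nodes=7))   # -> 3 linearly independent
                                                 #    paths to cover in testing
```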

Keywords: static analysis, McCabe cyclomatic complexity metric, SIP, UML

Procedia PDF Downloads 117
47238 Enhanced Calibration Map for a Four-Hole Probe for Measuring High Flow Angles

Authors: Jafar Mortadha, Imran Qureshi

Abstract:

This research explains the modern techniques used for measuring the flow angles of a flowing fluid and compares them with the traditional technique of using multi-hole pressure probes. In particular, the focus of the study is on four-hole probes, which offer great reliability and benefits in several applications where modern measurement techniques are either inconvenient or impractical. Thanks to advances in manufacturing, small multi-hole pressure probes can be made with high precision, which eliminates the need to calibrate every manufactured probe. This study aims to improve the range of calibration maps for a four-hole probe so that high flow angles can be measured accurately. The research methodology comprises a literature review of the calibration definitions that have been implemented successfully on five-hole probes. These definitions are then adapted and applied to a four-hole probe using a set of raw pressure data. A comparison of the different definitions is carried out in MATLAB, and the results are analyzed to determine the best calibration definition. Taking into account both simplicity of implementation and reliability of flow angle estimation, a technique adapted from a 2002 research paper offered the most promising outcome. Consequently, this method is seen as a good enhancement for four-hole probes and can substitute for existing, less accurate calibration definitions.
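
As a heavily hedged illustration of what a calibration definition and map inversion look like, the sketch below forms non-dimensional pitch and yaw coefficients from the four hole pressures and interpolates a calibration map to recover flow angles; the coefficient definitions and all data here are assumptions, not the adapted 2002 definitions used in the paper.

```python
import numpy as np
from scipy.interpolate import griddata

def probe_coefficients(p1, p2, p3, p4):
    """One common style of non-dimensionalization (assumed, not the paper's)."""
    p_mean = (p2 + p3 + p4) / 3.0          # average of the peripheral holes
    denom = p1 - p_mean                    # pseudo-dynamic pressure
    return (p2 - p4) / denom, (p3 - p_mean) / denom   # (c_pitch, c_yaw)

# Calibration database: known (pitch, yaw) angles vs. measured coefficients
rng = np.random.default_rng(1)
angles = np.array([(a, b) for a in range(-20, 21, 5) for b in range(-20, 21, 5)])
coeffs = angles / 15.0 + 0.02 * rng.standard_normal(angles.shape)  # assumed map

# Invert the map: interpolate angles at a newly measured coefficient pair
p1, p2, p3, p4 = 1210.0, 1065.0, 990.0, 1020.0   # example hole pressures, Pa
measured = np.array([probe_coefficients(p1, p2, p3, p4)])
pitch = griddata(coeffs, angles[:, 0], measured, method='linear')
yaw = griddata(coeffs, angles[:, 1], measured, method='linear')
print(f"pitch = {pitch[0]:.1f} deg, yaw = {yaw[0]:.1f} deg")
```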

Keywords: calibration definitions, calibration maps, flow measurement techniques, four-hole probes, multi-hole pressure probes

Procedia PDF Downloads 294
47237 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features

Authors: Stylianos Kampakis

Abstract:

This paper presents a new algorithm for neural networks called Input Dropout with Aggressive Re-weighting (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features, and it can be seen as a generalization of dropout. The algorithm is tested on two different benchmark datasets: a noisy version of the iris dataset and the MADELON dataset. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO, and random forests. The results demonstrate that IDAR can be an effective technique for handling datasets with many useless features.
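
A hedged sketch of the input-dropout half of IDAR in Keras; the aggressive re-weighting step is the paper's own contribution and is not specified in the abstract, so only dropout applied directly to the input neurons is shown, with layer sizes and rates as assumptions (the 500-feature input mirrors MADELON).

```python
import tensorflow as tf
from tensorflow.keras import layers, models

n_features = 500          # MADELON has 500 features, most of them uninformative

model = models.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dropout(0.5),                      # dropout on *input* neurons
    layers.Dense(64, activation='relu'),
    layers.Dropout(0.2),                      # ordinary hidden-layer dropout
    layers.Dense(32, activation='relu'),
    layers.Dense(1, activation='sigmoid'),    # binary classification head
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])
model.summary()
```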

Keywords: neural networks, feature selection, regularization, aggressive reweighting

Procedia PDF Downloads 454
47236 Current-Based Multiple Faults Detection in Electrical Motors

Authors: Moftah BinHasan

Abstract:

Induction motors (IM) are vital components in industrial processes, and their failure may lead to an unexpected interruption of the industrial plant, with heavy consequences in cost, product quality, and safety. Among the detection approaches proposed in the literature, the one based on stator current monitoring, termed Motor Current Signature Analysis (MCSA), is the most preferred, owing to its non-invasive nature. The popularity of MCSA comes from the fact that the current contains harmonics around the supply frequency whose properties relate to different healthy and faulty conditions. One of the techniques applied to the machine line current is spectrum analysis. Besides discussing the fundamentals of MCSA and its applications in the condition monitoring arena, this paper summarizes the most frequent faults and their signatures in the stator current spectrum of an induction motor. In addition, it presents several case studies of induction motor fault diagnosis. The faults were seeded in the machine, which was run for more than an hour for each test before the results were recorded for the faulty conditions. These results are then compared with those recorded earlier for the healthy cases.
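
A minimal sketch of the spectrum-analysis step in MCSA: take the FFT of the stator current and inspect the components around the supply frequency. The synthetic 50 Hz current with 45/55 Hz fault sidebands is an illustrative assumption standing in for a measured signal.

```python
import numpy as np

fs = 1000.0                                  # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)                 # 10 s of "measured" current
current = (np.sin(2 * np.pi * 50 * t)                 # supply component
           + 0.05 * np.sin(2 * np.pi * 45 * t)        # lower sideband (fault)
           + 0.05 * np.sin(2 * np.pi * 55 * t)        # upper sideband (fault)
           + 0.01 * np.random.randn(t.size))

spectrum = np.abs(np.fft.rfft(current)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Report the strongest components below 100 Hz (supply line plus sidebands)
band = freqs < 100
for f in freqs[band][np.argsort(spectrum[band])[-3:]]:
    print(f"component near {f:.1f} Hz")
```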

Keywords: induction motor, condition monitoring, fault diagnosis, MCSA, rotor, stator, bearing, eccentricity

Procedia PDF Downloads 457
47235 Estimation of Transition and Emission Probabilities

Authors: Aakansha Gupta, Neha Vadnere, Tapasvi Soni, M. Anbarsi

Abstract:

Protein secondary structure prediction is one of the most important goals pursued by bioinformatics and theoretical chemistry; it is highly important in medicine and biotechnology. Some aspects of protein function and genome analysis can be predicted by secondary structure prediction, which is used to help annotate sequences, classify proteins, identify domains, and recognize functional motifs. In this paper, we represent protein secondary structure as a mathematical model. To extract and predict the protein secondary structure from the primary structure, we require a set of parameters: any constants appearing in the model are specified by these parameters, which also provide a mechanism for efficient and accurate use of data. Many algorithms exist for estimating these model parameters, of which the most popular is the Expectation-Maximization (EM) algorithm. The model parameters are estimated from protein datasets such as RS126 by using the Bayesian probabilistic method (the dataset being categorical). This work can then be extended to comparing the efficiency of the EM algorithm with that of other algorithms for estimating the model parameters, which will in turn lead to an efficient component for protein secondary structure prediction. Furthermore, these parameters can be used for predicting the secondary structure of proteins using machine learning techniques such as neural networks and fuzzy logic. The ultimate objective is to obtain greater accuracy than previously achieved.
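
As a hedged illustration of where transition and emission probabilities come from, the sketch below estimates them by maximum-likelihood counting on a fully labeled toy sequence; in the unlabeled case, the EM (Baum-Welch) algorithm replaces these counts with expected counts. The residue and state strings are illustrative, not RS126 data.

```python
from collections import defaultdict

# Amino-acid sequence with aligned secondary-structure labels
# (H = helix, E = strand, C = coil), toy data only
residues = "MKVLAAGGELKVL"
states   = "CHHHHEEECCHHH"

trans = defaultdict(lambda: defaultdict(int))   # state -> next state counts
emit = defaultdict(lambda: defaultdict(int))    # state -> residue counts
for i, (aa, st) in enumerate(zip(residues, states)):
    emit[st][aa] += 1
    if i + 1 < len(states):
        trans[st][states[i + 1]] += 1

def normalize(table):
    """Turn raw counts into conditional probability distributions."""
    out = {}
    for s, row in table.items():
        total = sum(row.values())
        out[s] = {t: c / total for t, c in row.items()}
    return out

print("transition:", normalize(trans))
print("emission:", normalize(emit))
```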

Keywords: model parameters, expectation maximization algorithm, protein secondary structure prediction, bioinformatics

Procedia PDF Downloads 479
47234 Software Quality Assurance in Component-Based Software Development – A Survey Analysis

Authors: Abeer Toheed Quadri, Maria Abubakar, Mehreen Sirshar

Abstract:

Component Based Software Development (CBSD) is a new trend in software development. However, the selection of quality components is not by itself enough to ensure software quality in a Component Based Software System (CBSS): a software product is considered a quality product if it satisfies its customers' needs and has minimal defects. The authors survey different research papers and analyze various techniques that ensure software quality in component based software development, including an investigation of how to improve the quality of a component based software system without affecting its quality attributes. The reported information is identified from a literature survey. The development of component based systems is rising, as they reduce development time, effort, and cost by means of reuse. The analysis shows that, in order to achieve quality in a CBSS, the components must be certified through software measurement, because the predictability of a system's software quality attributes depends on the quality attributes of the constituent components, the integration process, and the framework used.

Keywords: CBSD (component based software development), CBSS (component based software system), quality components, SQA (software quality assurance)

Procedia PDF Downloads 410
47233 Data Analysis to Uncover Terrorist Attacks Using Data Mining Techniques

Authors: Saima Nazir, Mustansar Ali Ghazanfar, Sanay Muhammad Umar Saeed, Muhammad Awais Azam, Saad Ali Alahmari

Abstract:

Terrorism is an important and challenging concern. The entire world is threatened by only a few sophisticated terrorist groups, and especially in the Gulf Region and Pakistan, terrorism has become an extremely destructive phenomenon in recent years. Predicting the pattern of attack type, attack group, and target type is an intricate task. This study offers new insight into terrorist groups' attack types and their chosen targets. The paper proposes a framework for the prediction of terrorist attacks using historical data, associating each terrorist group with its attack type and target. The analysis shows that the number of attacks per year will keep increasing, and that Al-Harmayan in Saudi Arabia, Al-Qai'da in the Gulf Region, and Tehreek-e-Taliban in Pakistan will remain responsible for many future terrorist attacks. Under constant circumstances, the top targets of each group will be private citizens and property, police, government, and the military sector.

Keywords: data mining, counter terrorism, machine learning, SVM

Procedia PDF Downloads 405
47232 Experimental, Computational Fluid Dynamics and Theoretical Study of Cyclone Performance Based on Inlet Velocity and Particle Loading Rate

Authors: Sakura Ganegama Bogodage, Andrew Yee Tat Leung

Abstract:

This paper describes an experimental, Computational Fluid Dynamics (CFD), and theoretical analysis of cyclone performance, operated at a solid loading rate of 1.0 g/m³ and at two different inlet velocities (5 m/s and 10 m/s). Comparing the experimental results with the theoretical and CFD simulation results shows that the influence of solids on the processing flow is more significant than expected. Experimental studies of the gas-solid flows in cyclone separators are complicated, as they require advanced, sensitive measuring techniques, especially for flow characteristics. CFD modelling and theoretical analysis are therefore economical ways of analyzing cyclone separator performance, but detailed clarification of their application to cyclone performance evaluation has not yet been provided. The present study identifies the limitations of the influencing parameters in CFD and theoretical considerations by comparing experimental results with flow characteristics from CFD modelling.

Keywords: cyclone performance, inlet velocity, pressure drop, solid loading rate

Procedia PDF Downloads 236
47231 Intrusion Detection in SCADA Systems

Authors: Leandros A. Maglaras, Jianmin Jiang

Abstract:

The protection of national infrastructures from cyberattacks is one of the main issues for national and international security. The EU-funded Framework 7 (FP7) research project CockpitCI introduces intelligent intrusion detection, analysis, and protection techniques for Critical Infrastructures (CI). The paradox is that CIs massively rely on the newest interconnected, and vulnerable, Information and Communication Technology (ICT), whilst the control equipment, legacy software and hardware, is typically old. Such a combination of factors may lead to very dangerous situations, exposing systems to a wide variety of attacks. To overcome such threats, the CockpitCI project combines machine learning techniques with ICT technologies to produce advanced intrusion detection, analysis, and reaction tools that provide intelligence to field equipment, allowing it to make local decisions and to self-identify and self-react to abnormal situations introduced by cyberattacks. In this paper, an intrusion detection module capable of detecting malicious network traffic in a Supervisory Control and Data Acquisition (SCADA) system is presented. Malicious data in a SCADA system disrupt its correct functioning and tamper with its normal operation. The One-Class Support Vector Machine (OCSVM) is an intrusion detection mechanism that needs no labeled data for training and no prior information about the kind of anomaly expected in the detection process. This feature makes it ideal for processing SCADA environment data and automating SCADA performance monitoring. The OCSVM module developed is trained offline on network traces and detects anomalies in the system in real time. The module is part of an intrusion detection system (IDS) developed under the CockpitCI project and communicates with the other parts of the system through the exchange of IDMEF messages, which carry information about the source of the incident, the time, and a classification of the alarm.
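
A minimal sketch of one-class SVM anomaly detection in the spirit of the module described above, using scikit-learn; the synthetic two-feature traffic records are illustrative assumptions, not SCADA traces.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Train offline on normal traffic only (e.g., packet rate, mean packet size)
normal = rng.normal(loc=[100.0, 512.0], scale=[10.0, 40.0], size=(500, 2))
detector = OneClassSVM(kernel='rbf', nu=0.05, gamma='scale').fit(normal)

# At run time, score new observations: +1 = normal, -1 = anomaly
new = np.array([[102.0, 520.0],      # looks like normal traffic
                [400.0, 64.0]])      # flood-like pattern
print(detector.predict(new))         # expected: [ 1 -1 ]
```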

Keywords: cyber-security, SCADA systems, OCSVM, intrusion detection

Procedia PDF Downloads 552
47230 Examining Criminology via Diverse Philosophical Paradigms: Considering the Nomological-Deductive Model of Science versus the Humanistic Tradition

Authors: William R. Crawley

Abstract:

The current paper examines the primary conceptual and historical foundations leading to contemporary perspectives in criminological theory. This subject area involves the examination of theory that is vast and highly interdisciplinary but must, at its core, consider several postulates. The following areas are the focus of this examination: the presentation of various definitions of criminology as a discipline, and attention to the dialogue over whether criminological modes of explanation can be regarded as scientific with respect to focus, methods, and findings, e.g., conceptualization, operationalization, measurement strategies, and analytical techniques. Specifically, two opposing philosophical frameworks, naturalistic and anti-naturalistic philosophy, are examined by means of conceptual analysis of their necessary and sufficient conditions. As in all academic disciplines, if practitioners and students of criminology are to understand and effectively use its insights and discoveries, its axioms and methodologies must be critically scrutinized. This paper provides a primer for this critique.

Keywords: anti-naturalistic philosophy, humanistic tradition, is criminology a science, naturalistic philosophy, nomological-deductive model

Procedia PDF Downloads 68
47229 Evaluating the Effects of Fundamental Analysis on the Earnings Per Share Concept in Stock Valuation on the Zimbabwe Stock Exchange Market

Authors: Brian Basvi

Abstract:

Fundamental analysis is a technique for analyzing a security's intrinsic value by examining relevant financial, economic, and other qualitative and quantitative factors. Earnings Per Share (EPS), a crucial metric in fundamental analysis, is calculated by dividing a company's net income by the total number of outstanding shares. With more than 70 listed businesses, the Zimbabwe Stock Exchange (ZSE) is the primary stock exchange in Zimbabwe. This study applies the EPS financial ratio and stock valuation techniques to historical stock data from 68 companies listed on the ZSE. The analysis shows that EPS significantly affects the prices of shares listed on the market. The study's objective was to assess how fundamental analysis affects the role of EPS in ZSE stock valuation, and it concluded that EPS is an important consideration for investors when making investment decisions. According to the findings, fundamental analysis is a useful tool for ZSE investors, since it offers insightful information about a company's financial performance and aids decision-making. By examining EPS and other fundamental factors, investors can better understand a company's underlying worth and its prospects for future growth.
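
A minimal worked example of the EPS formula cited above (EPS = net income / outstanding shares), with illustrative figures and an assumed P/E multiple to show one simple valuation use:

```python
net_income = 4_500_000.0         # annual net income (illustrative)
shares_outstanding = 12_000_000  # total shares outstanding (illustrative)

eps = net_income / shares_outstanding
print(f"EPS = {eps:.2f} per share")           # -> EPS = 0.38 per share

# A simple valuation use: implied price at an assumed P/E multiple
pe_ratio = 8.0
print(f"Implied price = {eps * pe_ratio:.2f}")  # -> 3.00
```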

Keywords: fundamental analysis, stock valuation, EPS, share pricing

Procedia PDF Downloads 42
47228 Effect of Microstructure of Graphene Oxide Fabricated through Different Self-Assembly Techniques on Alcohol Dehydration

Authors: Wei-Song Hung

Abstract:

We utilized pressure-, vacuum-, and evaporation-assisted self-assembly techniques to deposit graphene oxide (GO) on modified polyacrylonitrile (mPAN). The fabricated composite GO/mPAN membranes were applied to dehydrate 1-butanol mixtures by pervaporation. The different driving forces in the self-assembly techniques induced different GO assembly-layer microstructures. XRD results indicated that the GO layer d-spacing varied from 8.3 Å to 11.5 Å. The evaporation-assisted technique resulted in a heterogeneous GO layer with loop structures; this layer was shown to be hydrophobic, in contrast to the hydrophilic layers formed by the other two techniques. With the pressure-assisted technique, the composite membrane exhibited exceptional pervaporation performance at 30 °C: a water concentration at the permeate side of 99.6 wt% and a permeation flux of 2.54 kg m⁻² h⁻¹. Moreover, the membrane sustained its operating stability at a high temperature of 70 °C, maintaining a high water concentration of 99.5 wt% and attaining a permeation flux as high as 4.34 kg m⁻² h⁻¹. This excellent separation performance stemmed from the dense, highly ordered laminate structure of the GO.
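
A small worked example of recovering the interlayer d-spacing reported above from an XRD peak via Bragg's law (nλ = 2d sin θ); the Cu Kα wavelength and the example peak angle are assumptions.

```python
import math

wavelength = 1.5406      # Cu K-alpha X-ray wavelength, angstroms (assumed source)
two_theta_deg = 10.6     # example GO (001) peak position, degrees (assumed)

theta = math.radians(two_theta_deg / 2)
d = wavelength / (2 * math.sin(theta))   # first-order reflection, n = 1
print(f"d-spacing = {d:.1f} angstroms")  # ~8.3 A, within the reported range
```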

Keywords: graphene oxide, self-assembly, alcohol dehydration, polyacrylonitrile (mPAN)

Procedia PDF Downloads 293
47227 Cantilever Secant Pile Constructed in Sand: Numerical Comparative Study and Design Aids – Part II

Authors: Khaled R. Khater

Abstract:

All civil engineering projects include excavation work and therefore need retaining structures, and cantilever secant pile walls are an economical supporting system for depths up to 5.0 m. The parameters controlling wall tip displacement are the focus of this paper. Two analysis techniques have been investigated and arbitrated: the conventional method and finite element analysis. Accordingly, two computer programs have been used, an Excel spreadsheet and Plaxis-2D. Two soil constitutive models have been used throughout this study: the Mohr-Coulomb model and the isotropic hardening model. Two soil densities have been considered, i.e., loose and dense sand. Ten wall rigidities have been analyzed, covering the range from perfectly flexible to completely rigid walls. Three excavation depths, i.e., 3.0 m, 4.0 m, and 5.0 m, were tested to cover the practical range of secant piles. This work offers useful guidance on secant piles to assist designers and specification committees, and finite element analysis with the isotropic hardening model is recommended as the fair judge when two designs conflict. A rational procedure using empirical equations has been suggested to upgrade the conventional method for predicting the wall tip displacement 'δ', and a reasonable limit on 'δ' as a function of the excavation depth 'h' has been suggested. It has also been found that, beyond a certain penetration depth, any further increase does not improve the wall tip displacement, i.e., it represents over-design and is uneconomical.

Keywords: design aids, numerical analysis, secant pile, wall tip displacement

Procedia PDF Downloads 187