Search results for: feature points
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1675


385 Identification, Prediction and Detection of the Process Fault in a Cement Rotary Kiln by Locally Linear Neuro-Fuzzy Technique

Authors: Masoud Sadeghian, Alireza Fatehi

Abstract:

In this paper, we use a nonlinear system identification method to predict and detect process faults in a cement rotary kiln. After selecting proper inputs and outputs, an input-output model is identified for the plant. To identify the various operating points of the kiln, a Locally Linear Neuro-Fuzzy (LLNF) model is used. This model is trained by the LOLIMOT algorithm, an incremental tree-structure algorithm. Using this method, we obtained three distinct models for the normal and faulty situations in the kiln: one for the normal condition of the kiln, with a 15-minute prediction horizon, and two for the faulty situations, with 7-minute prediction horizons. Finally, we detect these faults in validation data. The data collected from the White Saveh Cement Company are used in this study.

Keywords: Cement Rotary Kiln, Fault Detection, Delay Estimation Method, Locally Linear Neuro Fuzzy Model, LOLIMOT.
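
For readers unfamiliar with LLNF models, the sketch below illustrates only the prediction step: the output is a validity-weighted sum of local linear models. The centers, widths, and coefficients are made-up placeholders; the paper's LOLIMOT training procedure, fault thresholds, and kiln variables are not reproduced here.

```python
import numpy as np

# Minimal sketch of a Locally Linear Neuro-Fuzzy (LLNF) prediction step.
# LOLIMOT would construct the centers/widths incrementally by axis-orthogonal
# splits of the input space; here they are hypothetical values.

def llnf_predict(u, centers, sigmas, coeffs, intercepts):
    """u: (d,) input; centers/sigmas: (M, d); coeffs: (M, d); intercepts: (M,)."""
    # Gaussian validity functions, normalized so they sum to one.
    phi = np.exp(-0.5 * np.sum(((u - centers) / sigmas) ** 2, axis=1))
    phi /= phi.sum()
    # Each local model is linear; the output interpolates between them.
    local_outputs = coeffs @ u + intercepts
    return float(phi @ local_outputs)

# Toy usage with M=3 local models in a 2-D input space.
centers = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
sigmas = np.full((3, 2), 0.7)
coeffs = np.array([[0.5, -0.1], [0.3, 0.2], [-0.2, 0.4]])
intercepts = np.array([0.1, 0.0, -0.05])
print(llnf_predict(np.array([0.8, 0.6]), centers, sigmas, coeffs, intercepts))
```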

384 Fused Structure and Texture (FST) Features for Improved Pedestrian Detection

Authors: Hussin K. Ragb, Vijayan K. Asari

Abstract:

In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on the combination of local phase information with texture features. Since the phase of a signal conveys more structural information than the magnitude, the phase congruency concept is used to capture the structural features. On the other hand, the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless nature of phase congruency and the robustness of the CSLBP operator on flat images, as well as under blur and illumination changes, make the proposed descriptor more robust and less sensitive to light variations. The proposed descriptor is formed by extracting the phase congruency and the CSLBP value of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for the local regions in the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and the low-resolution DaimlerChrysler datasets to evaluate the detection performance of the pedestrian detection system based on the FST descriptor. A linear Support Vector Machine (SVM) is used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.

Keywords: Pedestrian detection, phase congruency, local phase, LBP features, CSLBP features, FST descriptor.
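
As a concrete illustration of the texture half of the descriptor, the sketch below computes the CSLBP code of a single pixel from its 8-neighborhood: center-symmetric neighbor pairs are compared rather than each neighbor with the center, which halves the code length (16 bins instead of 256). The threshold value is an assumption; the phase-congruency computation and the histogram concatenation are omitted.

```python
import numpy as np

# Minimal sketch of the Center-Symmetric LBP (CSLBP) code for one pixel.
# T is a small threshold on normalized intensities (an assumed value).

def cslbp_pixel(patch, T=0.01):
    """patch: 3x3 array of intensities centered on the pixel of interest."""
    # 8 neighbors in circular order; center-symmetric pairs are i and i+4.
    n = np.array([patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                  patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]], float)
    code = 0
    for i in range(4):  # compare each neighbor with its diametric opposite
        if n[i] - n[i + 4] > T:
            code |= 1 << i
    return code  # one of 16 possible codes (2^4)

patch = np.array([[0.2, 0.8, 0.3], [0.5, 0.5, 0.9], [0.1, 0.4, 0.6]])
print(cslbp_pixel(patch))
```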

383 A Differential Calculus Based Image Steganography with Crossover

Authors: Srilekha Mukherjee, Subha Ash, Goutam Sanyal

Abstract:

Information security plays a major role in uplifting the standard of secured communications via global media. In this paper, we suggest a technique of encryption followed by insertion before transmission, implementing two different concepts to carry out these tasks. We use the two-point crossover technique of the genetic algorithm to facilitate the encryption process. For each uniquely identified row of pixels, different mathematical methodologies are applied for several condition checks, in order to figure out all the parent pixels on which we perform the crossover operation. This is done by selecting two crossover points within the pixels, thereby producing the newly encrypted child pixels and hence the encrypted cover image. In the next stage, the first- and second-order derivative operators are evaluated to increase the security and robustness. The last stage ensures reapplication of the crossover procedure to form the final stego-image. The complexity of this system as a whole is huge, thereby dissuading third-party interference. Also, the embedding capacity is very high, so a larger amount of secret image information can be hidden. The imperceptibility of the obtained stego-image clearly demonstrates the proficiency of this approach.

Keywords: Steganography, Crossover, Differential Calculus, Peak Signal to Noise Ratio, Cross-correlation Coefficient.
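
The sketch below shows two-point crossover applied to a single pair of 8-bit pixels, in the spirit of the encryption step described above. How parent pixels and crossover points are chosen in the actual scheme follows the paper's row-wise condition checks, which are not reproduced; the bit positions here are illustrative.

```python
# Minimal sketch of two-point crossover between two 8-bit parent pixels:
# the bits between the two crossover points are swapped.

def two_point_crossover(p1, p2, lo=2, hi=6):
    """Swap bits lo..hi-1 (counted from the LSB) between two parent pixels."""
    mask = ((1 << hi) - 1) ^ ((1 << lo) - 1)  # bits lo..hi-1 set
    c1 = (p1 & ~mask) | (p2 & mask)
    c2 = (p2 & ~mask) | (p1 & mask)
    return c1, c2

print(two_point_crossover(0b10110100, 0b01001011))  # two child pixels
```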

382 Statistical Measures and Optimization Algorithms for Gene Selection in Lung and Ovarian Tumor

Authors: C. Gunavathi, K. Premalatha

Abstract:

Microarray technology is universally used in the study of disease diagnosis using gene expression levels. The main shortcoming of gene expression data is that it includes thousands of genes and a small number of samples. Abundant methods and techniques have been proposed for tumor classification using microarray gene expression data. Feature or gene selection methods can be used to mine the genes directly involved in the classification and to eliminate irrelevant genes. In this paper, statistical measures such as T-statistics, Signal-to-Noise Ratio (SNR), and F-statistics are used to rank the genes. The ranked genes are used for further classification. The Particle Swarm Optimization (PSO) algorithm and the Shuffled Frog Leaping (SFL) algorithm are used to find the significant genes among the top-m ranked genes. The Naïve Bayes Classifier (NBC) is used to classify the samples based on the significant genes. The proposed work is applied to lung and ovarian tumor datasets. The experimental results show that the proposed method achieves 100% accuracy on all three datasets, and the results are compared with previous works.

Keywords: Microarray, T-Statistics, Signal-to-Noise Ratio, F-Statistics, Particle Swarm Optimization, Shuffled Frog Leaping, Naïve Bayes Classifier.
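
The gene-ranking step can be illustrated with the widely used signal-to-noise statistic: genes whose class means differ strongly relative to their spread rank highest. The sketch below is a generic SNR ranking on toy data; the PSO/SFL search over the top-m genes and the Naïve Bayes classifier are not shown, and the exact statistic variants the paper uses may differ.

```python
import numpy as np

# Minimal sketch of SNR-based gene ranking for a two-class problem:
# SNR = |mu0 - mu1| / (sd0 + sd1), computed per gene.

def snr_rank(X, y, m=50):
    """X: (samples, genes) expression matrix; y: binary labels (0/1)."""
    mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    sd0, sd1 = X[y == 0].std(axis=0), X[y == 1].std(axis=0)
    snr = np.abs(mu0 - mu1) / (sd0 + sd1 + 1e-12)  # avoid divide-by-zero
    return np.argsort(snr)[::-1][:m]  # indices of the top-m ranked genes

X = np.random.rand(40, 2000)           # toy data: 40 samples, 2000 genes
y = np.array([0] * 20 + [1] * 20)
print(snr_rank(X, y, m=10))
```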

381 Mixture Design Experiment on Flow Behaviour of O/W Emulsions as Affected by Polysaccharide Interactions

Authors: Nor Hayati Ibrahim, Yaakob B. Che Man, Chin Ping Tan, Nor Aini Idris

Abstract:

Interaction effects of xanthan gum (XG), carboxymethyl cellulose (CMC), and locust bean gum (LBG) on the flow properties of oil-in-water emulsions were investigated by a mixture design experiment. Blends of XG, CMC, and LBG were prepared according to an augmented simplex-centroid mixture design (10 points) and used at 0.5% (wt/wt) in the emulsion formulations. An appropriate mathematical model was fitted to express each response as a function of the proportions of the blend components, able to empirically predict the response for any combination of the components. The synergistic interaction effect of the ternary XG:CMC:LBG blends at approximately 33-67% XG levels was shown to be much stronger than that of the binary XG:LBG blend at the 50% XG level (p < 0.05). Nevertheless, an antagonistic interaction effect became significant when the CMC level in blends was more than 33% (p < 0.05). The yield stress and apparent viscosity (at 10 s⁻¹) responses were successfully fitted with a special quartic model, while the flow behaviour index and consistency coefficient were fitted with a full quartic model (adjusted R² ≥ 0.90). This study found that a mixture design approach can serve as a valuable tool in elucidating and predicting interaction effects beyond the conventional two-component blends.

Keywords: O/W emulsions, flow behavior, polysaccharide interaction, mixture design.
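
The fitting step can be sketched with one common parameterization of a Scheffé-type special quartic mixture model, estimated by least squares. The design points follow the usual augmented simplex-centroid layout, but the responses and the exact model terms are placeholders, not the paper's data or fitted model.

```python
import numpy as np

# Minimal sketch of fitting a special quartic mixture model
# y = sum(bi*xi) + sum(bij*xi*xj)
#     + b1123*x1^2*x2*x3 + b1223*x1*x2^2*x3 + b1233*x1*x2*x3^2
# by least squares; proportions in a mixture design sum to one.

def special_quartic_design(X):
    x1, x2, x3 = X.T
    return np.column_stack([
        x1, x2, x3,                       # linear blending terms
        x1 * x2, x1 * x3, x2 * x3,        # binary interaction terms
        x1**2 * x2 * x3, x1 * x2**2 * x3, x1 * x2 * x3**2,  # special quartic
    ])

X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [.5, .5, 0], [.5, 0, .5],
              [0, .5, .5], [1/3, 1/3, 1/3], [2/3, 1/6, 1/6],
              [1/6, 2/3, 1/6], [1/6, 1/6, 2/3]])  # augmented simplex-centroid
y = np.random.rand(10)                            # placeholder responses
beta, *_ = np.linalg.lstsq(special_quartic_design(X), y, rcond=None)
print(beta)
```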

380 Comparing Data Analysis, Communication and Information Technologies Expertise Levels in Undergraduate Psychology Students

Authors: Ana Cázares

Abstract:

This study has two aims: first, to compare the expertise levels in data analysis, communication, and information technologies of undergraduate psychology students; second, to verify the factor structure of the E-ETICA (Escala de Experticia en Tecnologías de la Información, la Comunicación y el Análisis, or Data Analysis, Communication and Information Technologies Expertise Scale), which had shown excellent internal consistency (α = 0.92) as well as a simple factor structure. Three factors (Complex Technologies, Basic Information and Communications Technologies, and E-Searching and Download Abilities) explain 63% of the variance. In the present study, 260 students (119 juniors and 141 seniors) were asked to respond to the E-ETICA (a 16-item, five-point Likert scale, from 1: no mastery to 5: total mastery). The results show that junior and senior students report very similar expertise levels; however, the E-ETICA presents a different factor structure for juniors, for whom four factors also explained 63% of the variance: Information E-Searching, Download and Processing; Data Analysis; Organization; and Communication Technologies.

Keywords: Data analysis, information, communications technologies, expertise levels.

379 The Importance of Development in Laboratory Diagnosis at the Intersection

Authors: Agus Sahri, Cahya Putra Dinata, Faishal Andhi Rokhman

Abstract:

An intersection is a critical area of a highway, a place of conflict points and congestion due to the meeting of two or more roads. Conflicts that occur at an intersection include diverging, merging, weaving, and crossing movements. To deal with these conflicts, an intersection control system is needed; there are two control systems, namely signalized and unsignalized intersections. The control system at an intersection can affect its performance, and in Indonesia there are still many intersections with poor performance. Analysis of the parameters that measure the performance of an intersection in Indonesia is guided by the 1997 Indonesian Road Capacity Manual. For this reason, this study aims to develop laboratory diagnostics at intersections to analyze the parameters that can affect intersection performance. The research method used is research and development. The laboratory diagnosis includes anamnesis, differential diagnosis, inspection, diagnosis, prognosis, specimens, and analysis of sample data. It is expected that this research can encourage the development and application of laboratory diagnostics at intersections in Indonesia so that intersections can function optimally.

Keywords: Intersection, laboratory diagnostic, control systems, Indonesia.

378 Analysing the Elementary Science and Technology Coursebook and Student Workbook in Terms of Constructivism

Authors: Nil Duban

Abstract:

The curriculum of the primary school science course was redesigned on the basis of constructivism in the 2005-2006 academic year in Turkey. In this context, the name of the course was changed to "Science and Technology", and both the content and the course books and student workbooks for this course were redesigned in light of constructivism. The aim of this study is to determine whether the Science and Technology course books and student workbooks for the 5th grade of primary school are appropriate for constructivism, by evaluating them in terms of its fundamental principles. Among qualitative research methods, the documentation technique (i.e., document analysis) is applied; in selecting samples, criterion sampling is used among purposeful sampling techniques. When the Science and Technology course book and workbook for the 5th grade in primary education are examined, it is seen that both books complement each other in certain areas. Consequently, it can be claimed that, in spite of some inadequate and missing points, an attempt was made to design the course book and workbook of the primary school Science and Technology course for 5th grade students according to the principles of constructivism. To overcome the inadequacies in the books, it can be suggested that they be redesigned. In addition, so as not to ignore the technology dimension of the course, activities that encourage the students to prepare projects using the technology cycle should be included.

Keywords: Constructivism, coursebooks, science and technology education.

377 Model Order Reduction of Linear Time Variant High Speed VLSI Interconnects using Frequency Shift Technique

Authors: J. V. R. Ravindra, M. B. Srinivas

Abstract:

Accurate modeling of high-speed RLC interconnects has become a necessity to address signal integrity issues in current VLSI design. To accurately model a dispersive system of interconnects at higher frequencies, a full-wave analysis is required. However, conventional circuit simulation of interconnects with full-wave models is extremely CPU expensive. We present an algorithm for reducing large VLSI circuits to much smaller ones with similar input-output behavior. A key feature of our method, called the Frequency Shift Technique, is that it is capable of reducing linear time-varying systems. This enables it to capture frequency-translation and sampling behavior, important in communication subsystems such as mixers, RF components, and switched-capacitor filters. Reduction is obtained by projecting the original system, described by linear differential equations, into a lower dimension. Experiments have been carried out using the Cadence Design Simulator, which indicate that the proposed technique achieves greater percentage reduction with less CPU time than other model order reduction techniques in the literature. We also present applications to RF circuit subsystems, obtaining size reductions and evaluation speedups of orders of magnitude with insignificant loss of accuracy.

Keywords: Model order reduction, RLC, crosstalk.
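
The projection idea the abstract mentions can be illustrated on a time-invariant toy system. The sketch below uses a plain Krylov/Galerkin projection, a standard stand-in for projection-based reduction; it is not the authors' Frequency Shift Technique, which additionally handles time-varying systems.

```python
import numpy as np

# Minimal sketch of projection-based model order reduction for
#   dx/dt = A x + B u,  y = C x:
# project onto an orthonormal basis V, giving a smaller system with
# similar input-output behavior.

def reduce_model(A, B, C, r):
    # Build a crude Krylov sequence [B, AB, A^2 B, ...] and orthonormalize.
    K = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(r)])
    V, _ = np.linalg.qr(K)
    V = V[:, :r]
    # Galerkin projection: Ar = V^T A V, Br = V^T B, Cr = C V.
    return V.T @ A @ V, V.T @ B, C @ V

n = 50
A = -np.eye(n) + 0.01 * np.random.randn(n, n)   # toy stable-ish system
B = np.random.randn(n, 1)
C = np.random.randn(1, n)
Ar, Br, Cr = reduce_model(A, B, C, r=6)
print(Ar.shape, Br.shape, Cr.shape)              # (6, 6) (6, 1) (1, 6)
```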

376 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset

Authors: Adrienne Kline, Jaydip Desai

Abstract:

Electroencephalogram (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of the neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands.

Keywords: Brain-machine interface, EEGLAB, Emotiv EEG Neuroheadset, OpenViBE, Simulink.
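
For a flavor of the feature-extraction stage, the sketch below computes a common motor-imagery feature: the band power of one EEG channel in the mu band (8-12 Hz) via FFT. The sampling rate, band edges, and toy signal are assumptions; the paper's actual pipeline runs in Simulink on electrodes chosen from the EEGLAB analysis.

```python
import numpy as np

# Minimal sketch of band-power feature extraction from one EEG channel.

def band_power(signal, fs=128.0, lo=8.0, hi=12.0):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

t = np.arange(0, 2.0, 1 / 128.0)                 # 2 s of toy data at 128 Hz
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
print(band_power(eeg))                           # energy in the mu band
```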

375 Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

Authors: Dr. L. Arockiam, A. Aloysius

Abstract:

In general, class complexity is measured based on factors such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA), and so on. Several new techniques, methods, and metrics with different factors have been developed by researchers for calculating the complexity of a class in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a new complexity measure, namely Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of the class and of the classes derived. In EWCC, the cognitive weight of each attribute is considered to be 1. The main problem with the EWCC metric is that every attribute holds the same value, but in general, the cognitive load in understanding different types of attributes cannot be the same. So here, we propose a new metric, namely Attribute Weighted Class Complexity (AWCC). In AWCC, cognitive weights are assigned to the attributes, derived from the effort needed to understand their data types. The proposed metric has been shown to be a better measure of the complexity of a class with attributes through case studies and experiments.

Keywords: Software Complexity, Attribute Weighted Class Complexity, Weighted Class Complexity, Data Type
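
A small sketch of the metric's shape is given below: attributes contribute type-dependent cognitive weights, and method weights are added on top. The weight table is an illustrative assumption, not the paper's calibrated values.

```python
# Minimal sketch of an attribute-weighted class complexity in the spirit
# of AWCC. TYPE_WEIGHTS below is hypothetical; the paper derives weights
# from the effort needed to understand each data type.

TYPE_WEIGHTS = {"int": 1, "float": 1, "str": 1, "list": 2, "dict": 3,
                "object": 4}

def awcc(attribute_types, method_weights):
    """attribute_types: list of type names; method_weights: list of ints."""
    attr_cost = sum(TYPE_WEIGHTS.get(t, 4) for t in attribute_types)
    return attr_cost + sum(method_weights)

# A class with two primitive attributes, one dict, and two methods whose
# cognitive weights (e.g., from control-structure counts) are 3 and 5.
print(awcc(["int", "str", "dict"], [3, 5]))  # -> 13
```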

374 A Structural Support Vector Machine Approach for Biometric Recognition

Authors: Vishal Awasthi, Atul Kumar Agnihotri

Abstract:

The face is a strong, non-intrusive biometric for distinguishing genuine faces from dummy faces produced by different artificial means. Face recognition is extremely important in the contexts of computer vision, psychology, surveillance, pattern recognition, neural networks, and content-based video processing. The availability of a widespread face database is crucial to test the performance of face recognition algorithms. Openly available face databases include face images with a wide range of poses, illumination, gestures, and face occlusions, but there is no dummy face database accessible in the public domain. This paper presents a face detection algorithm based on image segmentation in terms of distance from a fixed point, together with template matching methods. The proposed work uses the most appropriate number of nodal points, resulting in the most appropriate outcomes in terms of face recognition and detection. The time taken to identify and extract distinctive facial features is improved to the range of 90 to 110 s, with an efficiency increase of 3%.

Keywords: Face recognition, Principal Component Analysis, PCA, Linear Discriminant Analysis, LDA, Improved Support Vector Machine, iSVM, elastic bunch mapping technique.

373 Gabriel-constrained Parametric Surface Triangulation

Authors: Oscar E. Ruiz, Carlos Cadavid, Juan G. Lalinde, Ricardo Serrano, Guillermo Peris-Fajarnes

Abstract:

The Boundary Representation of a 3D manifold contains FACEs (connected subsets of a parametric surface S: R^2 → R^3). In many science and engineering applications it is cumbersome and algebraically difficult to deal with the polynomial set and constraints (LOOPs) representing the FACE. For this reason, a Piecewise Linear (PL) approximation of the FACE is needed, usually represented in terms of triangles (i.e., 2-simplices). Solving the problem of FACE triangulation requires producing quality triangles which are: (i) independent of the arguments of S, (ii) sensitive to the local curvatures, (iii) compliant with the boundaries of the FACE, and (iv) topologically compatible with the triangles of the neighboring FACEs. The existing literature provides no guarantees for point (iii). This article contributes to the topic of triangulations conforming to the boundaries of the FACE by applying the concept of the parameter-independent Gabriel complex, which improves the correctness of the triangulation regarding aspects (iii) and (iv). In addition, the article applies the geometric concept of the tangent ball to a surface at a point to address points (i) and (ii). Additional research is needed in algorithms that (a) take advantage of the concepts presented in the proposed heuristic algorithm and (b) can be proved correct.

Keywords: surface triangulation, conforming triangulation, surface sampling, Gabriel complex.
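
The basic Gabriel condition underlying the Gabriel complex is easy to state: an edge (a, b) belongs to the complex if the smallest ball having ab as diameter contains no other sample point. The sketch below checks this condition in the ambient space; the paper's parameter-independent Gabriel complex on surfaces adds further machinery not shown here.

```python
import numpy as np

# Minimal sketch of the Gabriel condition for one edge of a point sample.

def is_gabriel_edge(a, b, points):
    a, b = np.asarray(a, float), np.asarray(b, float)
    center = (a + b) / 2.0
    radius2 = np.sum((a - b) ** 2) / 4.0       # squared radius of the ball
    for p in points:                           # all other samples
        p = np.asarray(p, float)
        if np.allclose(p, a) or np.allclose(p, b):
            continue
        if np.sum((p - center) ** 2) < radius2:
            return False                       # a sample lies inside the ball
    return True

pts = [(0, 0), (2, 0), (1, 0.4), (5, 5)]
print(is_gabriel_edge((0, 0), (2, 0), pts))    # False: (1, 0.4) is inside
```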

372 Online Brands: A Comparative Study of World Top Ranked Universities with Science and Technology Programs

Authors: Zullina H. Shaari, Amzairi Amar, Abdul Mutalib Embong, Hezlina Hashim

Abstract:

University websites are considered one of a brand's primary touch points for multiple stakeholders, but most do not have designs that create favorable impressions. Elements that web designers should carefully consider are appearance, content, functionality, usability, and search engine optimization. However, priority should be placed on website simplicity and negative space. In terms of content, previous research suggests that universities should include reputation, learning environment, graduate career prospects, destination image, cultural integration, and a virtual tour on their websites. The study examines how the top 200 world-ranking science and technology-based universities present their brands online and whether their websites capture these content dimensions. Content analysis of the websites revealed that the top-ranking universities captured these dimensions to varying degrees. Moreover, the UK-based university gave higher priority to website simplicity and negative space than the Malaysian-based university.

Keywords: Science and technology programs, top-ranked universities, online brands, university websites.

371 Chinese Language Teaching as a Second Language: Immersion Teaching

Authors: Lee Bih Ni, Kiu Su Na

Abstract:

This paper discusses the teaching of Chinese as a second language, focusing on immersion teaching. The researchers used a narrative literature review to describe the current state of both art and science in the focused areas of inquiry. Immersion teaching comes with a standard that teachers must reliably meet. Chinese language-immersion instruction consists of language and content lessons, including functional usage of the language, academic language, authentic language, and correct Chinese sociocultural language. The researchers used narrative literature reviews to build a scientific knowledge base, collecting all the important points of discussion and presenting them here with reference to the specific field on which this paper is based. The findings show that Chinese language immersion teaching is not like a standard foreign language classroom; the immersion setting provides more opportunities to teach students colloquial language than academic language. Immersion techniques also introduce a language’s cultural and social contexts in a meaningful and memorable way. It is particularly important that immersion teachers connect classwork with real-life experiences. Immersion also includes more elements of discovery and inquiry-based learning than other kinds of instructional practices. Students consistently interpret conclusions and context clues.

Keywords: A second language, Chinese language teaching, immersion teaching, instructional strategies.

370 Using Satellite Images Datasets for Road Intersection Detection in Route Planning

Authors: Fatma El-zahraa El-taher, Ayman Taha, Jane Courtney, Susan Mckeever

Abstract:

Understanding road networks plays an important role in navigation applications such as self-driving vehicles and route planning for individual journeys. Intersections of roads are essential components of road networks. Understanding the features of an intersection, from a simple T-junction to larger multi-road junctions, is critical to decisions such as crossing roads or selecting the safest routes. The identification and profiling of intersections from satellite images is a challenging task. While deep learning approaches offer state-of-the-art performance in image classification and detection, the availability of training datasets is a bottleneck in this approach. In this paper, a labelled satellite image dataset for the intersection recognition problem is presented. It consists of 14,692 satellite images of Washington DC, USA. To support other users of the dataset, an automated download and labelling script is provided for dataset replication. The challenges of construction and fine-grained feature labelling of a satellite image dataset are examined, including the issue of how to address features that are spread across multiple images. Finally, the accuracy of detection of intersections in satellite images is evaluated.

Keywords: Satellite images, remote sensing images, data acquisition, autonomous vehicles, robot navigation, route planning, road intersections.
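
To make the replication idea concrete, the sketch below shows the general shape of such a download-and-label loop. The tile endpoint URL, its parameters, and the coordinates are hypothetical placeholders; the authors' actual script, imagery source, and labels ship with the dataset release and are not reproduced here.

```python
import urllib.request

# Purely illustrative sketch of an automated download-and-label loop:
# fetch satellite tiles around known intersection coordinates and record
# a label per tile. TILE_URL is a placeholder, not a real service.

TILE_URL = "https://example.com/tiles?lat={lat}&lon={lon}&zoom={zoom}"

intersections = [(38.9072, -77.0369), (38.8951, -77.0364)]  # sample DC points

def download_tiles(points, zoom=18):
    labels = []
    for i, (lat, lon) in enumerate(points):
        url = TILE_URL.format(lat=lat, lon=lon, zoom=zoom)
        urllib.request.urlretrieve(url, f"tile_{i:05d}.png")
        labels.append((f"tile_{i:05d}.png", "intersection"))
    return labels

# for fname, label in download_tiles(intersections): print(fname, label)
```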

369 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model

Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin

Abstract:

Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. Performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.

Keywords: Anomaly detection, autoencoder, data centers, deep learning.
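
The sketch below shows one per-sensor LSTM autoencoder of the kind described above: it reconstructs a univariate window, and the reconstruction error would feed the downstream feature extraction and random forest stage (not shown). Window length, layer sizes, and training settings are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf

# Minimal sketch of a univariate LSTM autoencoder trained on normal data.

T = 32  # window length (timesteps), one feature per sensor
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(T, 1)),
    tf.keras.layers.LSTM(16),                          # encoder
    tf.keras.layers.RepeatVector(T),                   # latent -> sequence
    tf.keras.layers.LSTM(16, return_sequences=True),   # decoder
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])
model.compile(optimizer="adam", loss="mse")

normal = np.sin(np.linspace(0, 20, 500))               # toy "normal" signal
windows = np.stack([normal[i:i + T] for i in range(len(normal) - T)])[..., None]
model.fit(windows, windows, epochs=2, verbose=0)       # train on normal data

residual = np.abs(model.predict(windows, verbose=0) - windows)  # error signal
print(residual.mean())
```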

368 The Effectiveness of University’s Strategic Plan for Sustainability through Collaborative Platform’s Deliberation Matrix

Authors: Ashiquer Rahman

Abstract:

The paper focuses on the significance of the university's sustainability strategic plan and emphasizes the usefulness of a collaborative-platform-based deliberation matrix, which can equip the university's leadership to handle impending tactics and challenges in sustaining the strategic plan. The study addresses the significance of a set of reference points that precede operational activities for multi-stakeholder, multi-criteria evaluation of the optimal standards of a Sustainable University, as well as potential actions for the strategic blueprint of a Sustainable University. It examines the effectiveness of the university's sustainability strategic plan through a collaborative platform and deliberation matrix, and outlines the conceptual framing of a sustainable university implementing its strategic plan over them. Optimistically, these will be a milestone in higher education: a pathway to prepare for the university's upcoming implementation of its sustainability strategy. In fact, the collaborative platform and the deliberation matrix are both enhancement tools for institutional cooperation in a competitive world.

Keywords: Sustainable strategies, institutional cooperation, multi-stakeholder multi-criteria assessment, collaborative platform, innovative method and tools.

367 The Impact of Modeling Method of Moisture Emission from the Swimming Pool on the Accuracy of Numerical Calculations of Air Parameters in Ventilated Natatorium

Authors: Piotr Ciuman, Barbara Lipska

Abstract:

The aim of the presented research was to improve numerical predictions of the air parameter distribution in an actual natatorium by selecting the calculation formula for the mass flux of moisture emitted from the pool. The selected correlation should ensure the best agreement of the numerical results with measurements of these parameters in the facility. A numerical model of the natatorium was developed, for which boundary conditions were prepared on the basis of measurements carried out in the actual facility. Numerical calculations were carried out with ANSYS CFX software, with six formulas implemented that in various ways made the moisture emission dependent on the water surface temperature and the air parameters in the natatorium. The results of calculations with these formulas were compared for the distributions of the air parameters: specific humidity, velocity, and temperature in the facility. To select the best formula, the numerical results for these parameters in the occupied zone were validated by comparison with measurements carried out at selected points of this zone.

Keywords: Experimental validation, indoor swimming pool, moisture emission, natatorium, numerical calculations, CFD, thermal and humidity conditions, ventilation.

366 Developing a New Relationship between Undrained Shear Strength and Over-Consolidation Ratio

Authors: Wael M. Albadri, Hassnen M. Jafer, Ehab H. Sfoog

Abstract:

The relationship between undrained shear strength (Su) and over-consolidation ratio (OCR) of clay soil (marine clay) is very important in geotechnical engineering for estimating the settlement behaviour of clay and for preparing small-scale physical modelling tests. In this study, a relationship between shear strength and OCR was determined using the laboratory vane shear apparatus and a fully automatic consolidation apparatus. The main objective was to establish a non-linear correlation formula between shear strength and OCR and to compare it with previous studies. To achieve this objective, three points were chosen, from which 18 undisturbed samples were collected at depths increasing from 1.0 m to 3.5 m in 0.5 m increments. Clay samples were prepared under undrained conditions for both tests. It was found that OCR and shear strength are inversely proportional at similar depths and under the same undrained conditions. A good correlation was obtained from the relationships, with R² values very close to 1.0 using polynomial equations. The comparison between the experimental results and equations from other researchers produced a non-linear correlation with a pattern similar to this study's.

Keywords: Shear strength, over-consolidation ratio, vane shear test, clayey soil.
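
Deriving such a correlation amounts to fitting a polynomial to Su-OCR pairs and checking the goodness of fit. The sketch below does this with a quadratic on made-up data; the paper's measured values and fitted coefficients are not reproduced.

```python
import numpy as np

# Minimal sketch of a non-linear Su-OCR correlation via polynomial fitting.
# The data pairs are placeholders chosen to decrease with OCR, matching the
# "inversely proportional" trend the abstract reports.

ocr = np.array([1.2, 1.6, 2.1, 2.8, 3.5, 4.4])       # over-consolidation ratio
su = np.array([38.0, 33.5, 29.0, 26.2, 24.1, 22.5])  # undrained strength, kPa

coeffs = np.polyfit(ocr, su, deg=2)                  # quadratic polynomial
fit = np.poly1d(coeffs)

ss_res = np.sum((su - fit(ocr)) ** 2)
ss_tot = np.sum((su - su.mean()) ** 2)
print("R^2 =", 1 - ss_res / ss_tot)                  # goodness of fit
```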

365 On the Variability of Tool Wear and Life at Disparate Operating Parameters

Authors: S. E. Oraby, A.M. Alaskari

Abstract:

The stochastic nature of tool life estimated from conventional discrete-wear data in experimental tests usually arises from many individual and interacting parameters. It is common practice in batch production to continually use the same tool to machine different parts, using disparate machining parameters. In such an environment, the optimal points at which tools have to be changed, while achieving minimum production cost and maximum production rate within the surface roughness specifications, have not been adequately studied. In the current study, two relevant aspects are investigated using coated and uncoated inserts in turning operations: (i) the accuracy of using machinability information from fixed-parameter testing procedures when variable-parameter situations emerge, and (ii) the credibility of tool life machinability data from prior discrete testing procedures in non-stop machining. A novel technique is proposed and verified to normalize the conventional fixed-parameter machinability data to suit cases where parameters have to be changed for the same tool. Also, an experimental investigation has been established to evaluate the error in tool life assessment when machinability from discrete testing procedures is employed in uninterrupted practical machining.

Keywords: Machinability, tool life, tool wear, wear variability

364 Design and Development of 5-DOF Color Sorting Manipulator for Industrial Applications

Authors: Atef. A. Ata, Sohair F. Rezeka, Ahmed El-Shenawy, Mohammed Diab

Abstract:

Image processing in today’s world attracts massive attention, as it opens possibilities for broad application in many fields of high technology. The real challenge is how to improve existing sorting system applications, which consist of two integrated stations of processing and handling, with a new image processing feature. Existing color sorting techniques use a set of inductive, capacitive, and optical sensors to differentiate object color. This research presents a mechatronic color sorting system solution with the application of image processing. A 5-DOF robot arm is designed and developed with pick-and-place operation to act as the main part of the color sorting system. The image processing procedure senses the circular objects in an image captured in real time by a webcam fixed at the end-effector, then extracts color and position information from it. This information is passed as a sequence of sorting commands to the manipulator, which has a pick-and-place mechanism. Performance analysis proves that this color-based object sorting system works accurately under ideal conditions in terms of adequate illumination and circular object shape and color. The circular objects tested for sorting are red, green, and blue. For non-ideal conditions, such as an unspecified color, the accuracy reduces to 80%.

Keywords: Robotics manipulator, 5-DOF manipulator, image processing, Color sorting, Pick-and-place.
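
The sensing side of such a pipeline can be sketched with standard OpenCV building blocks: detect circular objects with a Hough transform and read the hue at each circle's center to classify red, green, or blue. The thresholds and Hough parameters are illustrative assumptions; the robot-command side of the pipeline is not shown.

```python
import cv2
import numpy as np

# Minimal sketch of circle detection plus hue-based color classification.

def detect_colored_circles(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=30, minRadius=10, maxRadius=80)
    results = []
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            hue = int(hsv[y, x, 0])              # OpenCV hue range: 0-179
            if hue < 10 or hue > 170:
                color = "red"
            elif 40 <= hue <= 80:
                color = "green"
            elif 100 <= hue <= 130:
                color = "blue"
            else:
                color = "unknown"
            results.append(((x, y), r, color))   # position for pick-and-place
    return results
```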

363 Analysis Model for the Relationship of Users, Products, and Stores on Online Marketplace Based on Distributed Representation

Authors: Ke He, Wumaier Parezhati, Haruka Yamashita

Abstract:

Recently, online marketplaces in the e-commerce industry, such as Rakuten and Alibaba, have become some of the most popular online marketplaces in Asia. In these shopping websites, consumers can select purchase products from a large number of stores. Additionally, consumers of the e-commerce site have to register their name, age, gender, and other information in advance, to access their registered account. Therefore, establishing a method for analyzing consumer preferences from both the store and the product side is required. This study uses the Doc2Vec method, which has been studied in the field of natural language processing. Doc2Vec has been used in many cases to analyze the extraction of semantic relationships between documents (represented as consumers) and words (represented as products) in the field of document classification. This concept is applicable to represent the relationship between users and items; however, the problem is that one more factor (i.e., shops) needs to be considered in Doc2Vec. More precisely, a method for analyzing the relationship between consumers, stores, and products is required. The purpose of our study is to combine the analysis of the Doc2vec model for users and shops, and for users and items in the same feature space. This method enables the calculation of similar shops and items for each user. In this study, we derive the real data analysis accumulated in the online marketplace and demonstrate the efficiency of the proposal.

Keywords: Doc2Vec, marketing, online marketplace, recommendation system.
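
One way to realize the idea is to treat each user's purchase history as a "document" whose "words" are item IDs, and to add the store ID as an extra document tag, so users, shops, and items end up in one vector space. The sketch below uses gensim's Doc2Vec on toy data; it is an interpretation of the modeling setup, not the authors' exact formulation or data.

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Minimal sketch: purchase histories as tagged documents.

purchases = [
    ("user1", "shopA", ["item1", "item2", "item3"]),
    ("user2", "shopA", ["item2", "item4"]),
    ("user3", "shopB", ["item5", "item1"]),
]
docs = [TaggedDocument(words=items, tags=[user, shop])
        for user, shop, items in purchases]

model = Doc2Vec(docs, vector_size=16, min_count=1, epochs=50)

# Users and shops (via tags) and items (via word vectors) become comparable.
print(model.dv.most_similar("user1", topn=2))         # similar users/shops
print(model.wv.most_similar("item1", topn=2))         # similar items
```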

362 Comparison of Proportional Control and Fuzzy Logic Control to Develop an Ideal Thermoelectric Renal Hypothermia System

Authors: Hakan Işık, Esra Saraçoğlu

Abstract:

In this study, a comparison of two control methods, Proportional Control (PC) and Fuzzy Logic Control (FLC), used to develop an ideal thermoelectric renal hypothermia system for renal surgery, has been carried out. Since the most important issues in long-lasting parenchymatous renal surgery are to provide an operation medium free of blood and to prevent renal dysfunction in the postoperative period, control of the temperature has become very important in renal surgery. The outcome is seriously affected by changes in temperature; therefore, it is necessary to reach the desired temperature points quickly and avoid large overshoot. A PIC16F877 microcontroller has been used as the controller for both methods. Each control method can ensure renal hypothermia in the targeted way, but the aim here is to investigate the advantages and disadvantages of each control method relative to the other, which is carried out through experimental implementations. In short, the investigation seeks the most appropriate method for developing the system, one that can be applied to people safely in the future. In this sense, the experimental results show that fuzzy logic control gives more reliable responses and more efficient performance.

Keywords: renal hypothermia, renal cooling, temperature control, proportional control, fuzzy logic control.
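
The contrast between the two controllers can be sketched on a toy first-order thermal model: proportional control scales its effort linearly with the error, while a fuzzy controller applies a small rule base over error regions. The plant constants, gain, rule table, and setpoint below are illustrative assumptions, not the paper's PIC16F877 implementation.

```python
# Minimal sketch comparing proportional and (very simplified) fuzzy control
# on a crude cooling model with a constant body-heat inflow term.

def proportional(error, kp=8.0):
    return kp * error                     # actuation grows linearly with error

def fuzzy(error):
    # A tiny rule base: large/medium/small cooling effort by error magnitude.
    if error > 2.0:
        return 20.0                       # far from setpoint: cool hard
    if error > 0.5:
        return 8.0 * error                # medium region behaves ~linearly
    return 2.0 * error                    # gentle action near the setpoint

setpoint, start_temp = 15.0, 36.0         # assumed target and initial temps
for controller in (proportional, fuzzy):
    t = start_temp
    for _ in range(100):                  # crude Euler steps of the plant
        u = controller(t - setpoint)      # cooling effort
        t += 0.05 * (-u + 0.3 * (36.0 - t))  # cooling + body heat inflow
    print(controller.__name__, round(t, 2))
```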

361 Developing a Sustainable Educational Portal for the D-Grid Community

Authors: Viktor Achter, Sebastian Breuers, Marc Seifert, Ulrich Lang, Joachim Götze, Bernd Reuther, Paul Müller

Abstract:

Within the last years, several technologies have been developed to help build e-learning portals. Most of them follow approaches that deliver a vast amount of functionality, suitable for class-like learning. The SuGI project, part of the D-Grid (funded by the BMBF), aims to deliver a highly scalable and sustainable learning solution providing materials (e.g., learning modules, training systems, webcasts, tutorials) containing knowledge about Grid computing to the D-Grid community. In this article, the process of developing an e-learning portal focused on the requirements of this special user group is described. Furthermore, it deals with the conceptual and technical design of an e-learning portal addressing the special needs of heterogeneous target groups. The main focus lies on the quality management of the software development process, the Web templates for uploading new content, and the rich search and filter functionalities, which are described from a conceptual as well as a technical point of view. Specifically, it points out best practices and concepts for providing a sustainable solution to a relatively unknown and highly heterogeneous community.

Keywords: D-Grid, e-learning, e-science, Grid computing, SuGI.

360 Enhancing Seismic Performance of Ductile Moment Frames with Delayed Wire-Rope Bracing Using Middle Steel Plate

Authors: Babak Dizangian, Mohammad Reza Ghasemi, Akram Ghalandari

Abstract:

Moment frames have considerable ductility against cyclic lateral loads and displacements; however, if this feature causes the relative displacement to exceed the permissible limit, it can impose unfavorable hysteretic behavior on the frame. Therefore, adding a bracing system capable of preserving a high energy absorption capacity and controlling displacements, without a considerable increase in stiffness, is quite important. This paper investigates the retrofitting of a single-storey steel moment frame through a delayed wire-rope bracing system using a middle steel plate. In this model, the steel plate lies where the wire ropes meet, and the model geometry is such that the cables are continuously under tension, so that they can take full advantage of their inherent capacity for tolerating tensile stress. Using the steel plate also reduces the system stiffness considerably compared to cross-bracing systems and preserves the ductile frame’s energy absorption capacity. In this research, software models of the delayed wire-rope bracing system have been studied, validated, and compared with other researchers’ laboratory test results.

Keywords: Ductile moment frame, delayed wire rope bracing, cyclic loading, hysteresis curve, energy absorption.

359 Robotic Assistance in Nursing Care: Survey on Challenges and Scenarios

Authors: Pascal Gliesche, Kathrin Seibert, Christian Kowalski, Dominik Domhoff, Max Pfingsthorn, Karin Wolf-Ostermann, Andreas Hein

Abstract:

Robotic assistance in nursing care is an increasingly important area of research and development. Facing a shortage of labor and an increasing number of people in need of care, the German Nursing Care Innovation Center (Pflegeinnovationszentrum, PIZ) aims to address these challenges from the side of technology. Little is known about nurses' experiences with existing robotic assistance systems. Nurses' perspectives on starting points for the development of robotic solutions that target recurring burdensome tasks in everyday nursing care are of particular interest. This paper presents findings focusing on robotics from an explanatory mixed-methods study on nurses' experiences with, and expectations for, innovative technologies in nursing care in inpatient and outpatient care facilities and hospitals in Germany. Based on the findings, eight scenarios for robotic assistance are identified based on the real needs of practitioners. An initial system addressing a single use case is described to show perspectives for the use of robots in nursing care.

Keywords: Robotics and automation, engineering management, engineering in medicine and biology, medical services, public healthcare.

358 A Real Time Comparison of Standalone and Grid Connected Solar Photovoltaic Generation Systems

Authors: Sachin Vrajlal Rajani, Vivek Pandya, Ankit Suvariya

Abstract:

Green and renewable energy is receiving extraordinary attention today because of environmental concerns raised by the burning of fossil fuels. Photovoltaic and wind power generation are the basic choices for producing electricity in this regard. Generating power with solar photovoltaic systems is well known, yet power producers may be confounded when choosing between on-grid and off-grid systems. In this research work, an endeavor is made to compare off-grid (stand-alone) and on-grid (grid-connected) systems. The work presents a comparative examination of two distinct PV systems situated at V.V.P. Engineering College, Rajkot. The first system is a 100 kW stand-alone system and the second is a 60 kW grid-connected system. The real-time parameters compared are: output voltage, load current, power in-flow, power output, performance ratio, yield factor, and capacity factor. The voltage changes and power fluctuations in both systems are given special consideration, and the comparison is made between the two systems to judge the advantages and limitations of each.

Keywords: Standalone PV systems, grid connected PV systems, comparison, real time data analysis.
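
The three summary metrics named above have standard definitions: yield factor YF = E / P_rated, capacity factor CF = E / (P_rated × hours), and performance ratio PR = YF / reference yield, where the reference yield is the in-plane irradiation divided by 1 kW/m². The sketch below computes them on placeholder numbers, not measured data from the V.V.P. Engineering College systems.

```python
# Minimal sketch of the standard PV performance metrics.

def yield_factor(energy_kwh, p_rated_kw):
    return energy_kwh / p_rated_kw                 # kWh per kWp

def capacity_factor(energy_kwh, p_rated_kw, hours):
    return energy_kwh / (p_rated_kw * hours)

def performance_ratio(energy_kwh, p_rated_kw, irradiation_kwh_m2):
    reference_yield = irradiation_kwh_m2 / 1.0     # hours at STC (1 kW/m^2)
    return yield_factor(energy_kwh, p_rated_kw) / reference_yield

# Hypothetical month for a 100 kW plant: 12,000 kWh, 150 kWh/m^2 in-plane.
print(yield_factor(12000, 100.0))                  # 120 kWh/kWp
print(capacity_factor(12000, 100.0, 30 * 24))      # ~0.167
print(performance_ratio(12000, 100.0, 150.0))      # 0.8
```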

357 A New Method in Detection of Ceramic Tiles Color Defects Using Genetic C-Means Algorithm

Authors: Mahkameh S. Mostafavi

Abstract:

In this paper, an algorithm is used to detect color defects in ceramic tiles. First, the image of a normal tile is clustered using GCMA, the Genetic C-Means Clustering Algorithm, which yields the best cluster centers. C-means is a common clustering algorithm that optimizes an objective function based on a measure between the data points and the cluster centers in the data space; here the objective function describes the mean square error. After finding the best centers, each pixel of the image is assigned to the cluster with the closest center. Then, the maximum errors of the clusters are computed: for each cluster, the maximum error is the maximum distance between its center and the pixels that belong to it. After computing the errors, all pixels of the defective tile image are clustered based on the centers obtained from the normal tile image in the previous stage. Pixels whose distance from their cluster center exceeds the maximum error of that cluster are considered defective.

Keywords: C-Means algorithm, color spaces, Genetic Algorithm, image clustering.
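
The detection stage is easy to sketch once centers are available: assign each test pixel to its nearest center from the normal tile and flag it if its distance exceeds that cluster's maximum error. The sketch below uses plain k-means-style updates as a stand-in for the genetic C-means search, on toy data.

```python
import numpy as np

# Minimal sketch of the per-cluster max-error defect test described above.

def fit_centers(pixels, k=3, iters=20):
    centers = pixels[np.random.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(pixels[:, None] - centers, axis=2),
                           axis=1)
        centers = np.array([pixels[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

normal = np.random.rand(1000, 3)                   # toy RGB pixels, normal tile
centers, labels = fit_centers(normal)
dists = np.linalg.norm(normal - centers[labels], axis=1)
max_err = np.array([dists[labels == j].max() for j in range(3)])  # per cluster

test = np.vstack([normal[:500], np.random.rand(20, 3) + 1.0])     # add outliers
t_lab = np.argmin(np.linalg.norm(test[:, None] - centers, axis=2), axis=1)
defective = np.linalg.norm(test - centers[t_lab], axis=1) > max_err[t_lab]
print(defective.sum(), "defective pixels flagged")
```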

356 Enhancing Cache Performance Based on Improved Average Access Time

Authors: Jasim. A. Ghaeb

Abstract:

A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and its speed, the cache has become a common feature of high performance computers. Enhancing cache performance has proved essential in speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. The performance of the cache is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. The optimum cache performance is obtained by focusing on a cache hardware modification that quickly rejects mismatched line tags in the hit-or-miss comparison stage, thus achieving a low hit time for the wanted line in the cache. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their Least Significant Bit (LSB). This division is exploited by the EOT technique to reject mismatched line tags in very little time compared to the time spent by the main comparator in the cache, giving an optimum hit time for the wanted cache line. The high performance of the EOT technique against the familiar mapping technique FAM is shown in the simulated results.

Keywords: Caches, Cache performance, Hit time, Cache hit ratio, Cache mapping, Cache memory.
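
The parity-split idea can be illustrated with a small software analogy of the hardware scheme: tags are stored in two tables keyed by their LSB, so a lookup only compares against tags of matching parity and rejects the rest immediately. The class and table sizes below are illustrative, not a model of the paper's circuit.

```python
# Minimal software sketch of the Even-Odd Tabulation (EOT) idea.

class EOTCache:
    def __init__(self):
        self.tables = {0: set(), 1: set()}   # even-LSB and odd-LSB tag stores

    def insert(self, tag):
        self.tables[tag & 1].add(tag)

    def lookup(self, tag):
        # Tags of the opposite parity are rejected without any comparison.
        return tag in self.tables[tag & 1]

cache = EOTCache()
for t in (0b1010, 0b0111, 0b1100):
    cache.insert(t)
print(cache.lookup(0b0111), cache.lookup(0b0110))  # True False
```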
