Search results for: Exploratory data message
6358 Autonomously Determining the Parameters for SVDD with RBF Kernel from a One-Class Training Set
Authors: Andreas Theissler, Ian Dear
Abstract:
The one-class support vector machine “support vector data description” (SVDD) is an ideal approach for anomaly or outlier detection. However, for the applicability of SVDD in real-world applications, ease of use is crucial. The results of SVDD are largely determined by the choice of the regularisation parameter C and the kernel parameter of the widely used RBF kernel. While for two-class SVMs the parameters can be tuned using cross-validation based on the confusion matrix, for a one-class SVM this is not possible, because only true positives and false negatives can occur during training. This paper proposes an approach to find the optimal set of parameters for SVDD based solely on a training set from one class and without any user parameterisation. Results on artificial and real data sets are presented, underpinning the usefulness of the approach.
Keywords: Support vector data description, anomaly detection, one-class classification, parameter tuning.
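The core difficulty described in the abstract can be made concrete with a small, hedged sketch (a generic illustration, not the parameter-selection procedure proposed in the paper): when a one-class model is evaluated on its own training set, the confusion matrix collapses to true positives and false negatives, so every parameter pair that accepts all training points looks equally good. The example assumes scikit-learn's OneClassSVM, whose RBF formulation is equivalent to SVDD, and synthetic data.

```python
# Illustrative sketch (not the authors' method): with only one-class training
# data, the confusion matrix degenerates to true positives and false negatives,
# so confusion-matrix-based cross-validation cannot rank (C, gamma) choices.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(loc=0.0, scale=1.0, size=(200, 2))  # one-class training set

for nu in (0.01, 0.05, 0.1):          # plays the role of the regularisation parameter
    for gamma in (0.1, 1.0, 10.0):    # RBF kernel parameter
        model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
        pred = model.predict(X_train)               # +1 = accepted, -1 = rejected
        tp = np.sum(pred == 1)                      # true positives
        fn = np.sum(pred == -1)                     # false negatives
        # false positives / true negatives are undefined: no negative samples
        # exist, so a parameter choice that accepts everything looks "perfect".
        print(f"nu={nu:<5} gamma={gamma:<5} TP={tp:4d} FN={fn:4d}")
```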
6357 Using Combination of Optimized Recurrent Neural Network with Design of Experiments and Regression for Control Chart Forecasting
Authors: R. Behmanesh, I. Rahimi
Abstract:
Recurrent neural networks (RNNs) are an efficient tool for modeling production control processes as well as services. In this paper, an RNN was combined with a regression model, and the combination was used to check whether the data obtained from the model, compared with the actual data, are valid for a variable process control chart. A maintenance process in the workshop of Esfahan Oil Refining Co. (EORC) was taken to illustrate the models. First, a regression was fitted to predict the response time of the process from the determined factors; then the error between the actual and predicted response time was used as the output of the RNN, with the same factors as input. Finally, the predictions of the combined model were scrutinized against test values in statistical process control to judge whether the forecasting efficiency is acceptable. During the training of the RNN, a design of experiments was set up to optimize the network.
Keywords: RNN, DOE, regression, control chart.
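As a rough, hedged illustration of the workflow sketched above (the factor values, data and 3-sigma limits are assumptions for this example, and the paper's RNN/DOE stage is not reproduced), a regression predicts the response time, its residuals are what an RNN would subsequently model, and the predictions are checked against control-chart limits:

```python
# Minimal sketch of the control-chart check (illustrative only; the paper
# couples the regression with an RNN trained on the residuals and tuned via
# design of experiments, which is omitted here).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(60, 3))                              # hypothetical process factors
y = 5 + X @ np.array([1.2, -0.4, 0.8]) + rng.normal(0, 1.5, 60)   # response time

reg = LinearRegression().fit(X, y)
pred = reg.predict(X)
resid = y - pred                                  # the error an RNN would model

# Shewhart-style 3-sigma limits for the response-time chart
center = y.mean()
ucl, lcl = center + 3 * y.std(ddof=1), center - 3 * y.std(ddof=1)
in_control = (pred >= lcl) & (pred <= ucl)
print(f"UCL={ucl:.2f}  LCL={lcl:.2f}  points in control: {in_control.sum()}/{len(pred)}")
```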
6356 Action Recognition in Video Sequences using a Mealy Machine
Authors: L. Rodriguez-Benitez, J. Moreno-Garcia, J.J. Castro-Schez, C. Solana, L. Jimenez
Abstract:
In this paper, the use of sequential machines for recognizing actions taken by objects detected by a general tracking algorithm is proposed. The system can deal with the uncertainty inherent in medium-level vision data; for this purpose, the input data are fuzzified. This transformation also allows the data to be managed independently of the selected tracking application and enables characteristics of the analyzed scenario to be added. The representation of actions by means of an automaton, and the generation of the input symbols for the finite automaton depending on the object and action being compared, are described. The output of the comparison between an object and an action is a numerical value that represents the membership of the object to the action; this value is computed according to how similar the object and the action are. The work concludes with the application of the proposed technique to identify the behavior of vehicles in road traffic scenes.
Keywords: Approximate reasoning, finite state machines, video analysis.
6355 Nonlinear Thermal Expansion Model for SiC/Al
Authors: T.R. Sahroni, S. Sulaiman, I. Romli, M.R. Salleh, H.A. Ariff
Abstract:
The thermal expansion behaviour of a silicon carbide (SCS-2) fibre reinforced 6061 aluminium matrix composite subjected to a thermal mechanical cycling (TMC) process was investigated. The thermal stress has an important effect on the longitudinal thermal expansion coefficient of the composite. The present paper uses experimental data on the thermal expansion behaviour of a SiC/Al composite for temperatures up to 370°C to carry out modelling of theoretical predictions.
Keywords: Nonlinear, thermal, fibre reinforced, metal matrix composites
6354 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe
Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis
Abstract:
The increasing availability of information about earth surface elevation (Digital Elevation Models, DEM) generated from different sources (remote sensing, aerial images, Lidar) poses the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to the high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Hence the need to merge large-scale height maps, which are typically available for free at worldwide level, with very specific high-resolution datasets. On the other hand, the third dimension increases the user experience and the data representation quality, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, which are a trending topic in Geovisual Analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
Keywords: Terrain builder, WebGL, virtual globe, CesiumJS, tiled map service, TMS, height-map, regular grid, Geovisual analytics, DTM.
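A minimal, hedged sketch of the merging step described above (assuming both datasets have already been resampled to a common regular grid; this is an illustration, not the actual Terrain Builder code):

```python
# Hedged sketch of the merging idea: a worldwide low-resolution height map is
# overridden by a local high-resolution survey wherever the latter has valid
# samples, on a common regular grid.
import numpy as np

low_res = np.full((512, 512), 100.0)        # global DEM resampled to the tile grid (metres)
high_res = np.full((512, 512), np.nan)      # local survey: NaN where no data exists
high_res[200:300, 200:300] = 95.0           # hypothetical high-resolution patch

merged = np.where(np.isnan(high_res), low_res, high_res)
# 'merged' would then be cut into tiles (e.g. a TMS pyramid of height maps)
# that a CesiumJS terrain provider can stream.
print(merged[250, 250], merged[0, 0])       # 95.0 (survey) and 100.0 (global DEM)
```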
6353 Optimal Aggregate Production Planning with Fuzzy Data
Authors: Wen-Lung Huang, Shih-Pin Chen
Abstract:
This paper investigates the optimization problem of multi-product aggregate production planning (APP) with fuzzy data. From the comprehensive viewpoint of conserving the fuzziness of the input information, this paper proposes a method that can completely describe the membership function of the performance measure. The idea is based on the well-known Zadeh's extension principle, which plays an important role in fuzzy theory. In the proposed solution procedure, a pair of mathematical programs parameterized by the possibility level α is formulated to calculate the bounds of the optimal performance measure at α. The membership function of the optimal performance measure is then constructed by enumerating different values of α. Solutions obtained from the proposed method contain more information and offer more chance of achieving a feasible disaggregate plan, which is helpful to the decision-maker in practical applications.
Keywords: fuzzy data, aggregate production planning, membership function, parametric programming
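For orientation, the pair of parametric programs mentioned above can be written in the following generic form (a hedged sketch of the standard extension-principle construction; the symbols are assumptions, not the paper's exact notation):

```latex
% For each possibility level \alpha, the fuzzy parameters c are restricted to
% their \alpha-cuts [c^L_\alpha, c^U_\alpha] and the bounds of the optimal
% objective of the APP model are
\begin{align}
  Z^L_\alpha &= \min_{c \in [c^L_\alpha,\, c^U_\alpha]} \; \min_{x \in X(c)} Z(x; c),\\
  Z^U_\alpha &= \max_{c \in [c^L_\alpha,\, c^U_\alpha]} \; \min_{x \in X(c)} Z(x; c),
\end{align}
% so that [Z^L_\alpha, Z^U_\alpha] is the \alpha-cut of the fuzzy optimal
% performance measure; enumerating \alpha \in (0, 1] traces out its membership
% function via the extension principle.
```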
6352 Course Adoption of MS Technologies – Case Study
Authors: Lilac Al Safadi, Rana Abu Nafesa, Regina Garcia
Abstract:
Motivated by the Microsoft Co. Academic Program initiative, the Department of Information Technology at King Saud University has adopted Microsoft products in three courses. The initiative aimed at enhancing the abilities of the university graduates and equipping them with skills that would help them in the job market. A number of methods of collecting assessment data were used to evaluate the course adoption initiative. The assessment data indicated that the goal of the course adoption is being achieved and that the students were much better prepared to design applications and administer networks.
Keywords: course adoption, assessment, programming, technologies
6351 A Review of Lortie’s Schoolteacher
Authors: Tsai-Hsiu Lin
Abstract:
Dan C. Lortie’s Schoolteacher: A Sociological Study is one of the best works on the sociology of teaching since W. Waller’s classic study, and a book worthy of review. Following the tradition of the symbolic interactionists, Lortie examined the qualities of those who take up the occupation of teaching. Using several methods to gather effective data, Lortie portrayed the ethos of the teaching profession; the work is therefore an important book on the teaching profession and teacher culture. Though outstanding, Lortie’s work is also flawed in that his perspectives and methodology were adopted largely from symbolic interactionism. First, Lortie analyzed many points regarding teacher culture; for example, he was interested in exploring “sentiment,” “cathexis,” and “ethos,” so he was more a psychologist than a sociologist. Second, symbolic interactionism led him to examine teacher culture from a micro view, thereby missing its structural aspects; for example, he did not fully discuss the issue of gender and he ignored the issue of race. Finally, following the qualitative sociological tradition, Lortie employed many qualitative methods to gather data but focused only on obtaining and presenting interview data. Moreover, the measurement methods he used were too simplistic to analyze the quantitative data fully.
Keywords: Lortie’s Schoolteacher, symbolic interactionism, teacher culture, teaching profession.
6350 Ezilla Cloud Service with Cassandra Database for Sensor Observation System
Authors: Kuo-Yang Cheng, Yi-Lun Pan, Chang-Hsing Wu, His-En Yu, Hui-Shan Chen, Weicheng Huang
Abstract:
The main mission of Ezilla is to provide a friendly interface for accessing virtual machines and quickly deploying a high-performance computing environment. Ezilla has been developed by the Pervasive Computing Team at the National Center for High-performance Computing (NCHC). Ezilla integrates Cloud middleware, virtualization technology, and a Web-based Operating System (WebOS) to form a virtual computer in a distributed computing environment. In order to scale the dataset and speed up access, we propose a sensor observation system to deal with a huge amount of data in the Cassandra database. The sensor observation system is based on Ezilla and stores sensor raw data in a distributed database. We adopt the Ezilla Cloud service to create virtual machines and log into them to deploy the sensor observation system. Integrating the sensor observation system with Ezilla allows experiment environments to be deployed quickly and a huge amount of data to be accessed through a distributed database that supports a replication mechanism to protect data security.
Keywords: Cloud, Virtualization, Cassandra, WebOS
6349 Application of Life Data Analysis for the Reliability Assessment of Numerical Overcurrent Relays
Authors: Mohd Iqbal Ridwan, Kerk Lee Yen, Aminuddin Musa, Bahisham Yunus
Abstract:
Protective relays are components of a protection system in the power system domain that provide the decision-making element for correct protection and fault-clearing operations. Failure of these protection devices may reduce the integrity and reliability of the power system protection, which will impact the overall performance of the power system. Hence it is imperative for power utilities to assess the reliability of protective relays to assure that they will perform their intended function without failure. This paper discusses the application of reliability analysis using a statistical method called Life Data Analysis in the Transmission Division of Tenaga Nasional Berhad (TNB), a government-linked power utility company in Malaysia, to assess and evaluate the reliability of numerical overcurrent protective relays from two different manufacturers.
Keywords: Life data analysis, Protective relays, Reliability, Weibull Distribution.
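Since the keywords point to the Weibull distribution, the life data analysis referred to here is usually built on the two-parameter Weibull model (standard form shown below; the paper's exact parameterisation is not given in the abstract):

```latex
% Two-parameter Weibull model typically used in life data analysis
% (standard definitions; shape \beta and scale \eta are estimated from the
% relay failure times):
\begin{align}
  f(t) &= \frac{\beta}{\eta}\left(\frac{t}{\eta}\right)^{\beta-1}
          \exp\!\left[-\left(\frac{t}{\eta}\right)^{\beta}\right], \qquad t \ge 0,\\
  R(t) &= \exp\!\left[-\left(\frac{t}{\eta}\right)^{\beta}\right],
\end{align}
% where R(t) is the reliability (probability of surviving to time t);
% \beta < 1, = 1, > 1 indicate decreasing, constant and increasing failure
% rates respectively.
```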
6348 DSLEP (Data Structure Learning Platform to Aid in Higher Education IT Courses)
Authors: Estevan B. Costa, Armando M. Toda, Marcell A. A. Mesquita, Jacques D. Brancher
Abstract:
The advances in technology in the last five years have allowed improvements in the educational area, such as the increase in the development of educational software. One of the techniques that emerged in this period is called Gamification, which is the utilization of video game mechanics outside their usual bounds. Recent studies involving this technique have provided positive results when applying these concepts in areas such as marketing, health and education. In education, there are studies that cover everything from elementary to higher education, with many variations to fit educators' methodologies. Within higher education, and focusing on IT courses, data structures are an important subject taught in many of these courses, as they are the basis of many systems. Based on the above, this paper presents the development of an interactive web learning environment, called DSLEP (Data Structure Learning Platform), to aid students in higher education IT courses. The system includes basic concepts of this subject, such as stacks, queues, lists, arrays and trees, and was implemented to ease the insertion of new structures. It was also implemented with gamification concepts, such as points, levels and leaderboards, to engage students in the search for knowledge and stimulate self-learning.
Keywords: Gamification, Interactive learning environment, Data structures, e-learning.
6347 Computing the Loop Bound in Iterative Data Flow Graphs Using Natural Token Flow
Authors: Ali Shatnawi
Abstract:
Signal processing applications which are iterative in nature are best represented by data flow graphs (DFGs). In these applications, the maximum sampling frequency is dependent on the topology of the DFG, the cyclic dependencies in particular. The determination of the iteration bound, which is the reciprocal of the maximum sampling frequency, is critical in the process of hardware implementation of signal processing applications. In this paper, a novel technique to compute the iteration bound is proposed. This technique differs from all previously proposed techniques in that it is based on the natural flow of tokens into the DFG rather than on the topology of the graph. The proposed algorithm has lower run-time complexity than all known algorithms. The performance of the proposed algorithm is illustrated through analytical analysis of the time complexity, as well as through simulation of some benchmark problems.
Keywords: Data flow graph, Iteration period bound, Rate-optimal scheduling, Recursive DSP algorithms.
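For reference, the quantity being computed is the classical iteration bound of a DFG (standard definition; the paper's token-flow technique for evaluating it is not reproduced here):

```latex
% Iteration (period) bound over all directed loops l of the DFG:
\begin{equation}
  T_\infty \;=\; \max_{l \in L} \left\{ \frac{t_l}{w_l} \right\},
\end{equation}
% where t_l is the sum of the computation times of the nodes in loop l and
% w_l is the number of delay elements (registers) in loop l; the maximum
% sampling frequency is 1 / T_\infty.
```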
6346 Performance of Random Diagonal Codes for Spectral Amplitude Coding Optical CDMA Systems
Authors: Hilal A. Fadhil, Syed A. Aljunid, R. Badlishah Ahmed
Abstract:
In this paper we study the use of a new code called Random Diagonal (RD) code for Spectral Amplitude Coding (SAC) optical Code Division Multiple Access (CDMA) networks using Fiber Bragg Gratings (FBG). An FBG consists of a fiber segment whose index of refraction varies periodically along its length. The RD code is constructed using a code level and a data level; one of the important properties of this code is that the cross correlation at the data level is always zero, which means that Phase Induced Intensity Noise (PIIN) is reduced. We find that the performance of the RD code is better than that of the Modified Frequency Hopping (MFH) and Hadamard codes. It has been observed through experimental and theoretical simulation that the BER for the RD code is significantly better than for the other codes. Proof-of-principle simulations of encoding with 3 channels and 10 Gbps data transmission have been successfully demonstrated, together with an FBG decoding scheme for cancelling the code level from the SAC signal.
Keywords: FBG, MFH, OCDMA, PIIN, BER.
6345 Urban Areas Management in Developing Countries: Analysis of the Urban Areas Crossed with Risk of Storm Water Drains, Aswan-Egypt
Authors: Omar Hamdy, Schichen Zhao, Hussein Abd El-Atty, Ayman Ragab, Muhammad Salem
Abstract:
One of the most risky areas in Aswan is Abouelreesh, which suffers from flood disasters, as heavy deluges inundate urban areas causing considerable damage to buildings and infrastructure. Moreover, the main problem has been the urban sprawl towards this risky area. This paper aims to identify the urban areas located in areas prone to flash floods. Analyzing this phenomenon needs a lot of data to ensure satisfactory results; however, in this case the official data and field data were limited, and therefore free sources of satellite data were used. This paper used ArcGIS tools to obtain the storm water drain network by analyzing DEM files. Additionally, historical imagery in Google Earth was studied to determine the age of each building. The last step was to overlay the urban area layer and the storm water drain layer to identify the vulnerable areas. The results of this study would be helpful to urban planners and government officials in estimating disaster risk and developing preliminary plans to recover the risky area, especially urban areas located in torrent paths.
Keywords: Risk area, DEM, storm water drains, GIS.
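The overlay described in the abstract could be reproduced along the following lines (a hedged sketch with hypothetical file names and an assumed 50 m buffer; the study itself used ArcGIS tools rather than this code):

```python
# Hedged sketch of the final overlay step (illustrative only; it presumes both
# layers share a projected CRS in metres).
import geopandas as gpd

buildings = gpd.read_file("urban_buildings.shp")     # hypothetical building footprints
drains = gpd.read_file("storm_water_drains.shp")     # drain lines derived from the DEM

# Flag buildings lying within 50 m of a storm water drain as vulnerable.
drain_zone = drains.copy()
drain_zone["geometry"] = drain_zone.geometry.buffer(50)
vulnerable = gpd.sjoin(buildings, drain_zone, how="inner", predicate="intersects")
vulnerable = vulnerable.loc[~vulnerable.index.duplicated(keep="first")]  # count each building once
print(f"{len(vulnerable)} of {len(buildings)} buildings fall in the risk zone")
```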
6344 Power Efficient OFDM Signals with Reduced Symbol's Aperiodic Autocorrelation
Authors: Ibrahim M. Hussain
Abstract:
Three new algorithms, based on minimization of the autocorrelation of transmitted symbols and on the SLM approach, and which are computationally less demanding, are proposed. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of the PAPR. The second algorithm generates multiple random sequences from the sequence generated in the first algorithm, all with the same autocorrelation value of 1; out of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted: multiple sequences are generated by modifying a fixed number of complex numbers in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that significant reduction in PAPR is achieved using the proposed algorithms.
Keywords: Aperiodic autocorrelation, OFDM, PAPR, SLM, wireless communication.
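For readers unfamiliar with SLM, the generic baseline against which such schemes are usually compared is sketched below (a hedged illustration of plain SLM, not the three proposed algorithms):

```python
# Generic SLM baseline: several phase-rotated versions of the same OFDM symbol
# are generated and the candidate with the lowest PAPR is transmitted.
import numpy as np

rng = np.random.default_rng(0)
N, U = 256, 8                                   # subcarriers, candidate sequences

def papr_db(x):
    """Peak-to-average power ratio of a time-domain signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N)   # QPSK symbols
phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(U, N)))    # random phase vectors
phases[0] = 1.0                                                 # keep the original as one candidate

candidates = np.fft.ifft(data * phases, axis=1)                 # time-domain OFDM symbols
paprs = np.array([papr_db(c) for c in candidates])
best = int(np.argmin(paprs))
print(f"original PAPR {paprs[0]:.2f} dB -> selected candidate {best}: {paprs[best]:.2f} dB")
# The index 'best' is the side information the receiver needs to undo the rotation.
```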
6343 Determining Fire Resistance of Wooden Construction Elements through Experimental Studies and Artificial Neural Network
Authors: Sakir Tasdemir, Mustafa Altin, Gamze Fahriye Pehlivan, Ismail Saritas, Sadiye Didem Boztepe Erkis, Selma Tasdemir
Abstract:
Artificial intelligence applications are commonly used in many fields of industry, in parallel with developments in computer technology. In this study, a fire room was prepared to test the resistance of wooden construction elements, and with this setup, experiments on polished materials were carried out. Utilizing the experimental data, an artificial neural network (ANN) was modelled in order to estimate the final cross-sections of the wooden samples remaining after the fire. In the modelling, experimental data obtained from the fire room were used. In the developed system, the first weight of the samples (ws, g), preliminary cross-section (pcs, mm2), fire time (ft, minutes) and fire temperature (t, °C) were taken as input parameters, and the final cross-section (fcs, mm2) as the output parameter. When the results obtained from the ANN and the experimental data were compared using statistical analyses, the two groups of data were found to be coherent, with no meaningful difference between them. As a result, it is seen that an ANN can safely be used in determining the cross-sections of wooden materials after fire, avoiding many disadvantages.
Keywords: Artificial neural network, final cross-section, fire retardant polishes, fire safety, wood resistance.
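A hedged sketch of a regressor with the input/output structure described above (the data, network size and library choice are assumptions for illustration; the paper's trained model and experimental measurements are not reproduced):

```python
# Toy ANN with inputs ws, pcs, ft, t and output fcs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 200
ws  = rng.uniform(200, 800, n)        # first weight of sample (g)
pcs = rng.uniform(2000, 6000, n)      # preliminary cross-section (mm^2)
ft  = rng.uniform(5, 60, n)           # fire time (minutes)
t   = rng.uniform(300, 900, n)        # fire temperature (deg C)
X = np.column_stack([ws, pcs, ft, t])
fcs = pcs - 0.002 * ft * t + rng.normal(0, 50, n)   # synthetic final cross-section (mm^2)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X, fcs)
print("predicted fcs:", model.predict([[500, 4000, 30, 600]])[0])
```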
6342 Potential of Detailed Environmental Data Produced by Information and Communication Technology Tools for Better Consideration of Microclimatology Issues in Urban Planning to Promote Active Mobility
Authors: Živa Ravnikar, Alfonso Bahillo Martinez, Barbara Goličnik Marušić
Abstract:
Climate change mitigation has been formally adopted and announced by countries across the globe, where cities are targeting carbon neutrality through various more or less successful, systematic and fragmentary actions. The article is based on the fact that environmental conditions affect human comfort and the usage of space. Urban planning can, with its sustainable solutions, not only support climate mitigation in terms of reducing global warming, but also enable natural processes that, in the immediate vicinity, produce environmental conditions that encourage people to walk or cycle. The article draws attention to the importance of integrating climate considerations into urban planning, where detailed environmental data play a key role, enabling urban planners to improve or monitor environmental conditions on cycle paths. In a practical respect, this paper tests a particular ICT tool, a prototype used for gathering environmental data. Data gathering was performed along the cycling lanes in Ljubljana (Slovenia), where the main objective was to assess the applicability of the tool's data within the planning of comfortable cycling lanes. The results suggest that such transportable devices for in-situ measurements can help a researcher interpret detailed environmental information, characterized by fine granularity and precise spatial and temporal resolution. The data can be interpreted within human comfort zones, with a graphical representation in the form of a map, enabling environmental conditions to be linked with the spatial context. The paper also provides preliminary results on the potential of such tools for identifying correlations between environmental conditions and different spatial settings, which can help urban planners prioritize interventions. The paper contributes to multidisciplinary approaches as it demonstrates the usefulness of such fine-grained data for better consideration of microclimatology in urban planning, which is a prerequisite for creating climate-comfortable cycling lanes that promote active mobility.
Keywords: Information and communication technology tools, urban planning, human comfort, microclimate, cycling lanes.
6341 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods
Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin
Abstract:
Nowadays, the data center industry faces strong challenges: increasing speed and data processing capacity while at the same time keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of this kind of facility use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques, or perfecting those already existing, would be a great advance for this type of industry. The installation of a temperature sensor matrix distributed in the structure of each server would provide the information required to obtain a temperature profile inside them instantly. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and they are expensive. Therefore, other less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation methods, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers’ equations by backward, forward and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first and second order derivatives of the Burgers’ equation obtained after these simplifications are the key to obtaining results with greater or lesser accuracy, regardless of the characteristic truncation error.
Keywords: Burgers’ equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile.
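For context, the viscous Burgers' equation and the three finite-difference discretizations named above take the following standard forms (the abstract does not give the simplified energy/enthalpy equations themselves, so only the generic schemes are shown):

```latex
% Viscous Burgers' equation:
\begin{equation}
  \frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x}
    = \nu\,\frac{\partial^2 u}{\partial x^2}
\end{equation}
% First derivative: forward, backward and central differences at grid point i
\begin{equation}
  \left.\frac{\partial u}{\partial x}\right|_i \approx
  \frac{u_{i+1}-u_i}{\Delta x}, \qquad
  \frac{u_i-u_{i-1}}{\Delta x}, \qquad
  \frac{u_{i+1}-u_{i-1}}{2\,\Delta x}
\end{equation}
% Second derivative: central difference
\begin{equation}
  \left.\frac{\partial^2 u}{\partial x^2}\right|_i \approx
  \frac{u_{i+1}-2u_i+u_{i-1}}{\Delta x^2}
\end{equation}
% with truncation errors O(\Delta x) for the one-sided schemes and
% O(\Delta x^2) for the central ones.
```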
6340 Image Ranking to Assist Object Labeling for Training Detection Models
Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman
Abstract:
Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for the image selection is a deep learning algorithm, based on a U-shaped architecture, which quantifies the presence of unseen data in each image in order to find the images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data curated by this algorithm resulted in a model with better performance than a model produced by sequentially labeling the same amount of data. Also, similar performance is achieved compared to a model trained on exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class, as well as more balanced classes, which proves beneficial when training a deep learning model.
Keywords: Computer vision, deep learning, object detection, semiconductor.
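The iterative selection loop described above can be outlined as follows (a hedged sketch: the U-shaped novelty network is abstracted behind a placeholder scoring function, and all names are assumptions):

```python
# Outline of active image selection for labeling; 'novelty_score' stands in
# for the model that quantifies unseen data per image.
import random

def novelty_score(image, labeled_set):
    """Placeholder: return how much 'unseen' content the image contains."""
    return random.random()

def label_images(images):
    """Placeholder for the manual annotation step."""
    return {img: f"labels_for_{img}" for img in images}

unlabeled = [f"img_{i:04d}" for i in range(1000)]
labeled = label_images(random.sample(unlabeled, 20))      # initial seed labeling
unlabeled = [i for i in unlabeled if i not in labeled]

batch_size, rounds = 20, 5
for _ in range(rounds):
    # rank remaining images by how much novel content they appear to contain
    ranked = sorted(unlabeled, key=lambda im: novelty_score(im, labeled), reverse=True)
    batch = ranked[:batch_size]                            # most novel images first
    labeled.update(label_images(batch))                    # manual labeling step
    unlabeled = [i for i in unlabeled if i not in labeled]
    # in the paper, the detection model and the novelty model would be
    # retrained here before the next selection round
print(f"labeled {len(labeled)} images instead of the full {len(labeled) + len(unlabeled)}")
```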
6339 Measuring the Level of Housing Defects in the Build-Then-Sell Housing Delivery System
Authors: S. N. F. Mohd Fauzi, N. Yusof, N. Zainul Abidin
Abstract:
When the Malaysian government announced the implementation of the Build-Then-Sell (BTS) system in 2007, proponents of the BTS argued that the implementation of this new system might provide houses with few defects. However, there has been no empirical data to support their argument. Therefore, this study was conducted to measure the level of housing defects in the BTS housing delivery system. A survey was conducted among the occupiers of six BTS residential areas. The BTS residential areas were identified through the media and, because of the small size of the population, all households in the BTS residential areas were required to participate in the study to enable the researcher to collect data concerning defects. A questionnaire was employed as the data collection instrument and distributed to the respondents of the study. The results show that the level of defects in BTS houses is low, as the rate of defects for all elements is slight. Such a low level of defects has apparently only affected the aesthetic value of the houses.
Keywords: Build-Then-Sell houses, housing defects, residential areas, occupiers
6338 Students’ Awareness of the Use of Poster, Power Point and Animated Video Presentations: A Case Study of Third Year Students of the Department of English of Batna University
Authors: Bahloul Amel
Abstract:
The present study examines students’ perceptions of the use of technology in learning English as a Foreign Language. Its aim is to explore and understand students’ preparation and presentation of posters, PowerPoint slides and animated videos by drawing attention to visual and oral elements. The data were collected through observations and semi-structured interviews and analyzed following phenomenological data analysis steps. The themes that emerged from the data, such as satisfaction with visual learning when using information and communication technology, providing structure to oral presentations and learning from peers’ presentations, draw attention to the use of posters, PowerPoint and animated videos, as each supports visual learning and the organization of thoughts in oral presentations.
Keywords: Animated Videos, EFL, Posters, PowerPoint presentations, Visual Learning.
6337 Ion Thruster Grid Lifetime Assessment Based on Its Structural Failure
Authors: Juan Li, Jiawen Qiu, Yuchuan Chu, Tianping Zhang, Wei Meng, Yanhui Jia, Xiaohui Liu
Abstract:
This article develops a numerical 3D model of ion thruster optic system sputter erosion depth using the IFE-PIC (Immersed Finite Element - Particle-in-Cell) and Monte Carlo methods, and calculates the downstream surface sputter erosion rate of the accelerator grid; the results are compared with LIPS-200 life test data. The results of the numerical model are in reasonable agreement with the measured data. Finally, we predict the lifetime of the 20 cm diameter ion thruster via the erosion data obtained with the model. The ultimate result demonstrates that under normal operating conditions, the erosion rate of the grooves worn on the downstream surface of the accelerator grid is 34.6 μm/1000 h, which means the conservative lifetime until structural failure of the accelerator grid is 11500 hours.
Keywords: Ion thruster, accelerator grid, sputter erosion, lifetime assessment.
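The two figures quoted above are consistent with a simple back-of-the-envelope check (an illustration only; the failure criterion itself is not stated in the abstract):

```latex
% Implied groove depth at the predicted end of life:
\begin{equation}
  d \approx 34.6~\frac{\mu\mathrm{m}}{1000~\mathrm{h}} \times 11500~\mathrm{h}
    \approx 398~\mu\mathrm{m} \approx 0.4~\mathrm{mm},
\end{equation}
% i.e. the conservative lifetime corresponds to wearing roughly 0.4 mm into
% the downstream surface of the accelerator grid.
```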
6336 Modular Data and Calculation Framework for a Technology-Based Mapping of the Manufacturing Process According to the Value Stream Management Approach
Authors: Tim Wollert, Fabian Behrendt
Abstract:
Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company’s perspective. Whereas its design principles, e.g. pull, value-adding and customer orientation, remain valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activities. The digitalization of processes in the context of Industry 4.0 enables new opportunities to reduce these manual efforts and make the VSM approach more agile. The paper at hand aims at providing a modular data and calculation framework, utilizing the available business data provided by information and communication technologies, for automating the value stream mapping process with a focus on the manufacturing process.
Keywords: Industry 4.0, lean management 4.0, value stream management 4.0, value stream mapping.
6335 Multi-Agent Systems for Intelligent Clustering
Authors: Jung-Eun Park, Kyung-Whan Oh
Abstract:
Intelligent systems are required in order to quickly and accurately analyze enormous quantities of data in the Internet environment. In intelligent systems, information extraction processes can be divided into supervised learning and unsupervised learning. This paper investigates intelligent clustering by unsupervised learning. Intelligent clustering is a clustering system that determines the clustering model for data analysis and evaluates the results by itself. Such a system can build a clustering model more rapidly, objectively and accurately than a human analyst. The methodology for the automatic clustering intelligent system is a multi-agent system that comprises a clustering agent and a cluster performance evaluation agent. An agent exchanges information about clusters with another agent, and the system determines the optimal cluster number through this information. Experiments using data sets from the UCI Machine Learning Repository are performed in order to prove the validity of the system.
Keywords: Intelligent Clustering, Multi-Agent System, PCA, SOM, VC(Variance Criterion)
6334 Multi-level Metadata Integration System: XML, RDF and RuleML
Authors: Messaouda Fareh, Omar Boussaid, Rachid Challal
Abstract:
Our work is part of the field of heterogeneous data integration, with the definition of a structural and semantic mediation model. Our aim is to propose an architecture for the mediation of heterogeneous sources’ metadata, represented by XML, RDF and RuleML models, providing metadata transparency to the user. This is achieved by including data structures of fundamentally different natures and allowing a query involving multiple sources to be decomposed into queries specific to these sources, and the results then recomposed.
Keywords: Mediator, Metadata, Query, RDF, RuleML, XML, XQuery.
6333 ISME: Integrated Style Motion Editor for 3D Humanoid Character
Authors: Ismahafezi Ismail, Mohd Shahrizal Sunar
Abstract:
The motion of a realistic 3D humanoid character is very important, especially for the industries developing computer animations and games. However, this type of motion involves very complex dimensional data, as well as body position, orientation and joint rotation. Integrated Style Motion Editor (ISME) is a method used to alter the 3D humanoid motion capture data utilised in computer animation and games development. This study was therefore carried out with the purpose of demonstrating a method that is able to manipulate and deform different motion styles by integrating a Key Pose Deformation Technique and a Trajectory Control Technique. This motion editing method allows the user to generate new motions from the original motion capture data using a simple interface control. Unlike previous methods, our method produces a realistic humanoid motion style in real time.
Keywords: Computer animation, humanoid motion, motion capture, motion editing.
6332 An Improved Preprocessing for Biosonar Target Classification
Authors: Turgay Temel, John Hallam
Abstract:
An improved processing description to be employed in biosonar signal processing in a cochlea model is proposed and examined. It is compared to conventional models using a modified discriminant analysis, and both are tested. Their performances are evaluated with echo data captured from natural targets (trees). Results indicate that the phase characteristics of the low-pass filters employed in the echo processing have a significant effect on class separability for this data.
Keywords: Cochlea model, discriminant analysis, neuro-spike coding, classification.
6331 A Consideration of the Achievement of Productive Level Parallel Programming Skills
Authors: Tadayoshi Horita, Masakazu Akiba, Mina Terauchi, Tsuneo Kanno
Abstract:
This paper gives a consideration of the achievement of productive-level parallel programming skills, based on data from the graduation studies at the Polytechnic University of Japan. The data show that most students can achieve parallel programming skills during the graduation study (about 600 to 700 hours) if the programming environment is limited to GPGPUs. However, the data also show that achieving productive-level parallel programming skills within the graduation study alone is a very demanding task. In addition, the data suggest that parallel programming environments for GPGPU, such as CUDA and OpenCL, may be more suitable for parallel computing education than other environments such as MPI on a cluster system and Cell.B.E. These results should be useful for the areas of not only software development, but also hardware product development using computer technologies.
Keywords: Parallel computing, programming education, GPU, GPGPU, CUDA, OpenCL, MPI, Cell.B.E.
6330 Impediments to Female Sports Management and Participation: The Experience in the Selected Nigeria South West Colleges of Education
Authors: Saseyi Olaitan Olaoluwa, Osifeko Olalekan Remigious
Abstract:
The study was meant to identify the impediments to female sports management and participation in the selected colleges. Seven colleges of education in the south-west part of the country were selected for the study. A total of one hundred and five subjects were sampled to supply data; only one hundred adequately completed and returned copies of the questionnaire were used for the data analysis. The collected data were analysed descriptively. The results of the study showed that inadequate funds, personnel, facilities, equipment, supplies, management of sports, supervision and coaching were some of the impediments to female sports management and participation, and that athletes were not encouraged to participate. Based on the findings, it was recommended that the government should come to the aid of the colleges by providing funds and other needs that will make sports attractive for enhanced participation.
Keywords: Female sports, impediments, management, Nigeria, south west, colleges.
6329 Combined Sewer Overflow forecasting with Feed-forward Back-propagation Artificial Neural Network
Authors: Achela K. Fernando, Xiujuan Zhang, Peter F. Kinley
Abstract:
A feed-forward, back-propagation Artificial Neural Network (ANN) model has been used to forecast the occurrence of wastewater overflows in a combined sewerage reticulation system. This approach was tested to evaluate its applicability as an alternative to the common practice of developing a complete conceptual, mathematical hydrological-hydraulic model of the sewerage system to enable such forecasts. The ANN approach obviates the need for an a-priori understanding and representation of the underlying hydrological-hydraulic phenomena in mathematical terms, but instead learns the characteristics of a sewer overflow from historical data. The performance of the standard feed-forward, back-propagation-of-error algorithm was enhanced by a modified data normalizing technique that enabled the ANN model to extrapolate into territory unseen in the training data. The algorithm and the data normalizing method are presented along with the ANN model output results, which indicate good accuracy in the forecasted sewer overflow rates. However, it was revealed that accurate forecasting of the overflow rates is heavily dependent on the availability of real-time flow monitoring at the overflow structure to provide antecedent flow rate data. The ability of the ANN to forecast the overflow rates without the antecedent flow rates (as is the case with traditional conceptual reticulation models) was found to be quite poor.
Keywords: Artificial Neural Networks, Back-propagation learning, Combined sewer overflows, Forecasting.
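One widely used normalization of the kind referred to above is sketched below (a generic illustration under the assumption that the network uses sigmoid output units; the abstract does not spell out the paper's own modified technique):

```python
# Scale the training data into a sub-interval such as [0.1, 0.9] instead of
# [0, 1], leaving headroom so the network can represent flows larger than any
# value seen in training.
import numpy as np

def scale(x, x_min, x_max, lo=0.1, hi=0.9):
    """Map x from [x_min, x_max] to [lo, hi]."""
    return lo + (hi - lo) * (x - x_min) / (x_max - x_min)

def unscale(y, x_min, x_max, lo=0.1, hi=0.9):
    """Inverse mapping back to physical units (e.g. overflow rate in L/s)."""
    return x_min + (y - lo) * (x_max - x_min) / (hi - lo)

flows = np.array([0.0, 120.0, 240.0, 300.0])          # hypothetical training flows
targets = scale(flows, flows.min(), flows.max())      # ANN targets in [0.1, 0.9]
# a network output of 0.95 still maps to a meaningful flow above the largest
# value seen in training:
print(unscale(0.95, flows.min(), flows.max()))        # about 318.8
```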