Search results for: data mining technique
8311 Modeling of Random Variable with Digital Probability Hyper Digraph: Data-Oriented Approach
Authors: A. Habibizad Navin, M. Naghian Fesharaki, M. Mirnia, M. Kargar
Abstract:
In this paper, we introduce the Digital Probability Hyper Digraph for modeling a random variable as a hierarchical data-oriented model.
Keywords: Data-Oriented Models, Data Structure, Digital Probability Hyper Digraph, Random Variable, Statistics and Probability.
8310 Effect of Gamma Irradiation on the Crystalline Structure of Poly(Vinylidene Fluoride)
Authors: Adriana Souza M. Batista, Cláubia Pereira, Luiz O. Faria
Abstract:
The irradiation of polymeric materials has received much attention because it can produce diverse changes in chemical structure and physical properties. Studying the chemical and structural changes of polymers is therefore important in practice to achieve optimal conditions for their modification. The effect of gamma irradiation on the crystalline structure of poly(vinylidene fluoride) (PVDF) has been investigated using differential scanning calorimetry (DSC) and X-ray diffraction (XRD) techniques. Gamma irradiation was carried out in air atmosphere with doses between 100 kGy and 3,000 kGy using a Co-60 source. In the melting thermograms of the irradiated samples, a bimodal melting endotherm with two melting temperatures is detected. The lower melting temperature is attributed to the melting of crystals originally present, and the higher melting peak to the melting of crystals reorganized upon heat treatment. These results are consistent with those obtained by the XRD technique, which shows increasing crystallinity with increasing irradiation dose, even though the latent heat of melting decreases.
Keywords: Differential scanning calorimetry, gamma irradiation, PVDF, X-ray diffraction technique.
8309 Wireless Transmission of Big Data Using Novel Secure Algorithm
Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha
Abstract:
This paper presents a novel algorithm for secure, reliable, and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay, and destination nodes. Big data has to be transmitted from source to relay and from relay to destination, with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The proposed algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data transmission region, segmenting the selected region, determining the probability ratio for each node (capture, non-capture, and eavesdropper node) in every segment, and evaluating the probability using binary-based evaluation. If the transmission is judged secure, the two-hop transmission of big data resumes; otherwise, the attackers are countered by the cooperative jamming scheme and the data are then transmitted over the two hops.
Keywords: Big data, cooperative jamming, energy balance, physical layer, two-hop transmission, wireless security.
8308 Behavior of Generated Gas in Lost Foam Casting
Authors: M. Khodai, S. M. H. Mirbagheri
Abstract:
In the lost foam casting process, the melting point of the metal, as well as the volume and rate of foam degradation, have a significant effect on the mold filling pattern. Therefore, gas generation capacity and gas gap length are two important parameters for modeling the mold filling time of lost foam casting processes. In this paper, the gas gap length at the liquid-foam interface for a low melting point alloy (aluminum) and a high melting point alloy (carbon steel) is investigated by the photography technique. Results of the photography technique indicated that the gas gap length and the mold filling time increase with increasing coating thickness and foam density. The gas gap lengths measured in aluminum and carbon steel depend on the foam density and were approximately 4-5 mm and 25-60 mm, respectively. Using a new system, the gas generation capacity for aluminum and steel was measured; the measurements indicated that gas generation in aluminum and carbon-steel lost foam casting was about 50 cc/g and 3,200 cc/g of polystyrene, respectively.
Keywords: Gas gap, lost foam casting, photography technique.
8307 Probabilistic Electrical Power Generation Modeling Using Decimal to Binary Conversion
Authors: Ahmed S. Al-Abdulwahab
Abstract:
Generation system reliability assessment is an important task which can be performed using deterministic or probabilistic techniques. The probabilistic approaches have significant advantages over the deterministic methods; however, they require more complicated modeling. A power generation model is a basic requirement for this assessment. One form of generation model is the well-known capacity outage probability table (COPT). Different analytical techniques have been used to construct the COPT. These approaches require considerable mathematical modeling of the generating units, and the units' models are then combined to build the COPT, which adds further burden to the process of creating it. The Decimal to Binary Conversion (DBC) technique is widely applied in electronic systems and computing. This paper proposes a novel utilization of DBC to create the COPT without engaging in analytical modeling or time-consuming simulations. The simple binary representation, "0" and "1", is used to model the states of the generating units. The proposed technique is shown to be an effective approach to building the generation model.
Keywords: Decimal to binary, generation, reliability.
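As an illustration only (not the authors' implementation), the sketch below enumerates the decimal numbers 0 to 2^n - 1, reads each generating unit's availability from the corresponding bit, and accumulates the capacity outage probability table; the unit capacities and forced outage rates are hypothetical.

```python
from collections import defaultdict

# Hypothetical units: (capacity in MW, forced outage rate)
units = [(50, 0.02), (50, 0.02), (100, 0.04), (200, 0.05)]

def build_copt(units):
    """Build a capacity outage probability table by decimal-to-binary enumeration."""
    n = len(units)
    copt = defaultdict(float)          # capacity out -> probability
    for state in range(2 ** n):        # every decimal number encodes one system state
        prob, cap_out = 1.0, 0
        for i, (cap, forc_rate) in enumerate(units):
            if (state >> i) & 1:       # bit i = 1 -> unit i is on forced outage
                prob *= forc_rate
                cap_out += cap
            else:                      # bit i = 0 -> unit i is available
                prob *= 1.0 - forc_rate
        copt[cap_out] += prob
    return dict(sorted(copt.items()))

if __name__ == "__main__":
    for cap_out, p in build_copt(units).items():
        print(f"{cap_out:4d} MW out  probability = {p:.6f}")
```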
8306 Automated Particle Picking based on Correlation Peak Shape Analysis and Iterative Classification
Authors: Hrabe Thomas, Beck Florian, Nickell Stephan
Abstract:
Cryo-electron microscopy (CEM) in combination with single particle analysis (SPA) is a widely used technique for elucidating structural details of macromolecular assemblies at close-to-atomic resolutions. However, development of automated software for SPA processing remains vital, since thousands to millions of individual particle images need to be processed. Here, we present our workflow for automated particle picking. Our approach adds peak shape analysis to the classical correlation and uses an iterative approach to separate macromolecules from background by classification. This particle selection workflow furthermore provides a robust means for SPA with little user interaction. Performance of the presented tools is assessed by processing simulated and experimental data.
Keywords: Cryo-electron microscopy, single particle analysis, image processing.
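The following is a toy sketch of correlation-based particle picking (template matching followed by local-maximum detection); it is not the authors' CEM pipeline, and the peak-shape analysis and iterative classification steps are omitted. The synthetic micrograph and template are placeholders.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def pick_particles(image, template, n_peaks=50, min_dist=10):
    """Cross-correlate a template with a micrograph and return the strongest local maxima."""
    t = template - template.mean()
    # Correlation is convolution with the flipped template
    corr = fftconvolve(image - image.mean(), t[::-1, ::-1], mode="same")
    # A pixel is a candidate peak if it equals the maximum of its neighborhood
    local_max = (corr == maximum_filter(corr, size=min_dist))
    ys, xs = np.nonzero(local_max)
    order = np.argsort(corr[ys, xs])[::-1][:n_peaks]
    return [(int(ys[i]), int(xs[i]), float(corr[ys[i], xs[i]])) for i in order]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    micrograph = rng.normal(size=(256, 256))
    blob = np.outer(np.hanning(15), np.hanning(15))    # synthetic particle template
    for y in [40, 120, 200]:
        micrograph[y:y + 15, y:y + 15] += 3 * blob      # plant a few particles
    print(pick_particles(micrograph, blob, n_peaks=5))
```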
8305 Feature Extraction of Dorsal Hand Vein Pattern Using a Fast Modified PCA Algorithm Based On Cholesky Decomposition and Lanczos Technique
Authors: Maleika Heenaye-Mamode Khan, Naushad Mamode Khan, Raja K. Subramanian
Abstract:
The dorsal hand vein pattern is an emerging biometric which has lately been attracting the attention of researchers. Research is being carried out on existing techniques in the hope of improving them or finding more efficient ones. In this work, Principal Component Analysis (PCA), a successful method originally applied to the face biometric, is modified using Cholesky decomposition and the Lanczos algorithm to extract dorsal hand vein features. This modified technique decreases the number of computations and hence the processing time. The eigenveins were successfully computed and projected onto the vein space. The system was tested on a database of 200 images, using a threshold value of 0.9 to obtain the False Acceptance Rate (FAR) and False Rejection Rate (FRR). This modified algorithm is desirable when developing a biometric security system, since it significantly decreases the matching time.
Keywords: Dorsal hand vein pattern, PCA, Cholesky decomposition, Lanczos algorithm.
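A compact sketch of this style of feature extraction is given below, using SciPy's eigsh, a Lanczos-type eigensolver, together with the small-covariance-matrix trick; the Cholesky-based modification of the paper is not reproduced, and the image data are synthetic stand-ins.

```python
import numpy as np
from scipy.sparse.linalg import eigsh   # Lanczos-type iterative eigensolver

def train_eigenveins(images, k=10):
    """images: (n_samples, n_pixels) array of flattened vein images."""
    mean = images.mean(axis=0)
    X = images - mean                        # center the data
    # Small-matrix trick (as in eigenfaces): eigenvectors of X X^T instead of X^T X
    L = X @ X.T
    vals, vecs = eigsh(L, k=k, which="LM")   # k largest eigenpairs via Lanczos iterations
    eigenveins = X.T @ vecs                  # map back to pixel space
    eigenveins /= np.linalg.norm(eigenveins, axis=0)
    return mean, eigenveins

def project(image, mean, eigenveins):
    """Project a flattened image onto the vein space to obtain its feature vector."""
    return eigenveins.T @ (image - mean)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    db = rng.random((200, 64 * 64))          # stand-in for 200 dorsal hand vein images
    mean, ev = train_eigenveins(db, k=10)
    f1, f2 = project(db[0], mean, ev), project(db[1], mean, ev)
    # Matching would compare such feature vectors against a distance threshold
    print(np.linalg.norm(f1 - f2))
```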
8304 Comparative Analysis of DTC Based Switched Reluctance Motor Drive Using Torque Equation and FEA Models
Authors: P. Srinivas, P. V. N. Prasad
Abstract:
Since torque ripple is the main cause of noise and vibration, the performance of a Switched Reluctance Motor (SRM) can be improved by minimizing its torque ripple using a novel control technique called Direct Torque Control (DTC). In the DTC technique, torque is controlled directly through control of the magnitude of the flux and the change in speed of the stator flux vector. The flux and torque are maintained within set hysteresis bands.
The DTC of the SRM is analyzed by two methods. In one method, the actual torque is computed by conducting Finite Element Analysis (FEA) on the design specifications of the motor. In the other, the torque is computed by a Simplified Torque Equation. The variation of peak current, average current, torque ripple, and speed settling time obtained with the Simplified Torque Equation model is compared with that of the FEA-based model.
Keywords: Direct Torque Control, Simplified Torque Equation, Finite Element Analysis, Torque Ripple.
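The hysteresis-band logic at the heart of DTC can be sketched as follows: flux and torque errors are quantized by hysteresis comparators and used to look up a voltage state. The band widths, sector handling, and switching table below are placeholders, not the authors' SRM-specific design.

```python
class Hysteresis:
    """Two-level hysteresis comparator with memory of its last output."""
    def __init__(self, band):
        self.band = band
        self.state = 1

    def update(self, error):
        if error > self.band:
            self.state = 1      # quantity too low -> command an increase
        elif error < -self.band:
            self.state = -1     # quantity too high -> command a decrease
        return self.state

def dtc_step(flux_ref, flux, torque_ref, torque, sector,
             flux_ctrl, torque_ctrl, switching_table):
    """One DTC control step: quantize the errors, then look up the inverter state."""
    d_flux = flux_ctrl.update(flux_ref - flux)
    d_torque = torque_ctrl.update(torque_ref - torque)
    return switching_table[(d_flux, d_torque)][sector]

if __name__ == "__main__":
    # Placeholder 6-sector table mapping (flux cmd, torque cmd) -> voltage vector index
    table = {(1, 1):   [2, 3, 4, 5, 6, 1],
             (1, -1):  [6, 1, 2, 3, 4, 5],
             (-1, 1):  [3, 4, 5, 6, 1, 2],
             (-1, -1): [5, 6, 1, 2, 3, 4]}
    fc, tc = Hysteresis(band=0.01), Hysteresis(band=0.05)
    print(dtc_step(1.0, 0.95, 5.0, 5.2, sector=0,
                   flux_ctrl=fc, torque_ctrl=tc, switching_table=table))
```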
8303 Application of Ultrasonic Assisted Machining Technique for Glass-Ceramic Milling
Authors: S. Y. Lin, C. H. Kuan, C. H. She, W. T. Wang
Abstract:
In this study, the ultrasonic assisted machining (UAM) technique is applied in a side-surface milling experiment on a glass-ceramic workpiece material. A diamond-coated tungsten carbide cutting tool is used in conjunction with two kinds of cooling/lubrication media: water-soluble (WS) cutting fluid and minimum quantity lubricant (MQL). Full factorial combinations of process parameters are planned for the milling experiments to investigate the effect of the process parameters on cutting performance. From the experimental results, the study searches for the better process parameter combinations for which the edge indentation and the surface roughness are acceptable. In the machining experiments, an ultrasonic oscillator was used to excite the cutting tool along the radial direction, producing a very small-amplitude vibration at a frequency of 20 kHz to assist the machining process. After processing, a toolmaker's microscope was used to examine the side-surface morphology, edge indentation, and cutting tool wear under different combinations of cutting parameters, and the experimental results were analyzed and discussed. The results show that the parameters mainly leading to edge indentation of the glass ceramic are cutting depth and feed rate; reducing edge indentation requires a lower cutting depth and feed rate. The water-soluble cutting fluid provides a better cooling effect in the primary cutting area; it may effectively reduce the edge indentation and improve the surface morphology of the glass ceramic. The use of the ultrasonic assisted technique can effectively enhance surface finish cleanliness and reduce cutting tool wear and edge indentation.
Keywords: Glass-ceramic, ultrasonic assisted machining, cutting performance, edge indentation.
8302 Impact of Stack Caches: Locality Awareness and Cost Effectiveness
Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang
Abstract:
Treating data based on its location in memory has received much attention in recent years, since data in different regions has different properties that affect cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. One important property of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved with 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design improves, on average, by approximately 3.9% for the Rijndael benchmark when a 1 KB stack cache is added. The stack cache is simulated using the SimpleScalar toolset.
Keywords: Hit rate, Locality of program, Stack cache, and Stack data.
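A simplified sketch of the split-cache idea is given below: addresses tagged as stack or non-stack are routed to separate direct-mapped caches and per-cache hit rates are reported. The cache sizes and the address trace are illustrative and do not reproduce the SimpleScalar configuration used in the paper.

```python
class DirectMappedCache:
    """Minimal direct-mapped cache model that only tracks hits and misses."""
    def __init__(self, size_bytes, line_bytes=32):
        self.lines = size_bytes // line_bytes
        self.line_bytes = line_bytes
        self.tags = [None] * self.lines
        self.hits = self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        block = addr // self.line_bytes
        idx, tag = block % self.lines, block // self.lines
        if self.tags[idx] == tag:
            self.hits += 1
        else:
            self.tags[idx] = tag   # miss: fill the line

    def hit_rate(self):
        return self.hits / self.accesses if self.accesses else 0.0

def simulate(trace, stack_base=0xFFFF0000):
    """Route each access to the stack or non-stack cache by its address region."""
    stack_cache = DirectMappedCache(2 * 1024)      # 2 KB, direct mapped (1-way)
    data_cache = DirectMappedCache(16 * 1024)
    for addr in trace:
        (stack_cache if addr >= stack_base else data_cache).access(addr)
    return stack_cache.hit_rate(), data_cache.hit_rate()

if __name__ == "__main__":
    # Toy trace: tight reuse near the stack pointer, scattered heap accesses
    trace = [0xFFFF0000 + (i % 64) * 4 for i in range(10000)] + \
            [0x1000 + (i * 4096) % 65536 for i in range(2000)]
    print(simulate(trace))
```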
8301 Probe-Assisted Axillary Lymph Node Biopsy Compared with Axillary Dissection in Breast Cancer: A Retrospective Study from the West of Iran
Authors: Morteza Alizadeh Foroutan, Hassan Moayeri, Keivan Sabooni, Motahareh Rouhi Ardeshiri
Abstract:
Breast cancer (BC) incidence is increasing annually in various parts of the world, and sentinel lymph node biopsy (SLNB) has become a new standard of care as a staging procedure. In the present study, the gamma probe technique was used for SLNB as a safe method with higher accuracy and fewer complications. The study sought to compare the results of two surgical techniques, namely axillary lymph node dissection (ALND) and SLNB, including epidemiological results and clinicopathological features of BC patients from the western provinces of Iran. In total, 420 women with BC who were referred to the breast clinic in Sanandaj, Kurdistan province, during 2017-2021 were identified. Of these, 318 patients underwent breast surgery, and 277 of them participated in the current study. Patients were divided into those undergoing ALND and those undergoing SLNB. The criteria for complete dissection or axillary biopsy using the gamma probe were based on the results of clinical examinations and the presence of palpable lymph nodes. Overall, postoperative complications occurred in 58 (18.9%) cases, comprising 15 (25.9%) and 43 (74.1%) patients in the SLNB and ALND groups, respectively (P = 0.74). Seroma (60.3%) was the most frequently reported complication in both groups. Most patients had tumors in the upper-outer quadrant of the left breast. The mean tumor dimension in the SLNB and ALND groups was 2.1 ± 1.3 cm and 3.2 ± 1.8 cm, respectively (P = 0.003). The benefits of breast-conserving surgery (BCS) with the SLNB technique are clear; it can be considered a method with fewer complications and a better prognosis. Accordingly, SLNB and BCS are favorable methods that can be performed along with the gamma probe technique, which is safe and accurate.
Keywords: Breast cancer, Sentinel lymph node biopsy, Axillary lymph node dissection, Gamma probe.
8300 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created using the source code, metrics computed from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such companies, training data from other projects is one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data are available. We analyze seven datasets from the NASA Metrics Data Program which offer design as well as code metrics. Overall, the results of cross-project learning are comparable to learning on within-company data.
Keywords: Software metrics, fault prediction, cross project, within project.
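A minimal sketch of cross-project fault prediction with Naïve Bayes using scikit-learn is shown below: the model is trained on the design metrics of one project and tested on another. The metric columns and data are synthetic stand-ins for the NASA MDP datasets.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)

def synthetic_project(n, shift=0.0):
    """Stand-in for a project's design metrics (e.g., fan-in/out, complexity) and fault labels."""
    X = rng.normal(loc=shift, size=(n, 4))
    y = (X.sum(axis=1) + rng.normal(scale=2.0, size=n) > 2).astype(int)
    return X, y

# Cross-project setting: the training and test data come from different projects
X_train, y_train = synthetic_project(500, shift=0.0)   # "source" project with fault history
X_test, y_test = synthetic_project(300, shift=0.3)     # "target" project with no fault data of its own

model = GaussianNB().fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test), digits=3))
```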
8299 Female Labor Force Participation in Third World Countries: An Empirical Analysis
Authors: Anam Azam, Muhammad Rafiq
Abstract:
The study identified the socio-economic and demographic factors affecting the labor force participation of both married and unmarried females in third world countries. Almost all of these countries face the same problems, but Pakistan was selected as a sample country. The main purpose of this study was to examine which factors force women to participate in the labor market. Data were therefore collected through a survey of both married and unmarried females between the ages of 20 and 49. Two models (probit and logit) were used to analyze the factors that affect female labor force participation (FLFP). The results showed that factors such as age, education, and marital status have a significant effect on FLFP. The findings also showed that educated women and those who belong to joint families participate more because of financial pressure.
Keywords: Education, financial status, family pressure, labor market participation.
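A brief sketch of how probit and logit participation models can be fitted with statsmodels follows; the variable names and simulated data are illustrative, not the survey data from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 500

# Simulated survey covariates: age (20-49), years of education, marital status
df = pd.DataFrame({
    "age": rng.integers(20, 50, n),
    "education": rng.integers(0, 17, n),
    "married": rng.integers(0, 2, n),
})
# Simulated participation outcome driven mainly by education and age
latent = -4 + 0.08 * df["age"] + 0.15 * df["education"] - 0.3 * df["married"]
df["participates"] = (latent + rng.logistic(size=n) > 0).astype(int)

X = sm.add_constant(df[["age", "education", "married"]])
logit_res = sm.Logit(df["participates"], X).fit(disp=False)
probit_res = sm.Probit(df["participates"], X).fit(disp=False)
print(logit_res.params, probit_res.params, sep="\n")
```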
8298 Economic and Environmental Benefits of the Best Available Technique Application in a Food Processing Plant
Authors: Frantisek Bozek, Pavel Budinsky, Ignac Hoza, Alexandr Bozek, Magdalena Naplavova
Abstract:
A cleaner production project was implemented in a bakery. The project is based on substituting the best available technique for an obsolete leaven production technology. The new technology enables the production of durable, high-quality leavens. Moreover, 25% of the flour used as the original raw material can be replaced by unsold pastry from the previous day's production; that pastry was previously disposed of in a waste incineration plant. Besides the environmental benefits resulting from less waste, lower energy consumption, and reduction of sewage water quantity and flour dustiness, there are also significant economic benefits. The payback period of the investment was calculated to be about 2.6 years using the static method of financial analysis and 3.5 years using the dynamic method, with an internal rate of return of more than 29%. The projected average annual profit after taxation in the second year of operation was in agreement with the real profit.
Keywords: Bakery, best available technology, cleaner production, costs, economic benefit, efficiency, energy, environmental benefit, investment, savings.
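As a small numeric illustration of the two appraisal measures mentioned above, the sketch below computes a simple (static) payback period and an internal rate of return found by bisection on the NPV; the cash-flow figures are invented.

```python
def payback_period(investment, annual_cash_flows):
    """Static payback: years until cumulative cash flow covers the investment."""
    cumulative = 0.0
    for year, cf in enumerate(annual_cash_flows, start=1):
        cumulative += cf
        if cumulative >= investment:
            # Interpolate within the year for a fractional result
            return year - (cumulative - investment) / cf
    return float("inf")

def npv(rate, investment, annual_cash_flows):
    return -investment + sum(cf / (1 + rate) ** t
                             for t, cf in enumerate(annual_cash_flows, start=1))

def irr(investment, annual_cash_flows, lo=0.0, hi=1.0, tol=1e-6):
    """Internal rate of return via bisection on the NPV sign change."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, investment, annual_cash_flows) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

if __name__ == "__main__":
    invest, flows = 100_000, [38_000] * 6     # hypothetical investment and 6 years of savings
    print(f"payback period: {payback_period(invest, flows):.2f} years")
    print(f"IRR: {irr(invest, flows):.1%}")
```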
8297 Extreme Temperature Forecast in Mbonge, Cameroon through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution
Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph
Abstract:
In this paper, temperature extremes are forecast by employing the block maxima method of the Generalized Extreme Value (GEV) distribution to analyze temperature data from the Cameroon Development Corporation (CDC). By considering two sets of data (raw and simulated) and two models of the GEV distribution (stationary and non-stationary), return level analysis is carried out. It was found that, in the stationary model, the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and simulated data show an increasing trend with an upper bound. This clearly shows that, even though temperatures in the tropics show signs of increasing in the future, there is a maximum temperature that is not exceeded. The results of this paper are vital for agricultural and environmental research.
Keywords: Return level, Generalized Extreme Value (GEV), meteorology, forecasting.
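A short sketch of block-maxima return level estimation with SciPy's GEV implementation is given below; the annual-maximum series is synthetic, and the non-stationary variant considered in the paper is not reproduced. The T-year return level is simply the (1 - 1/T) quantile of the fitted distribution.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
# Synthetic annual maximum temperatures (block maxima), a stand-in for the CDC series
annual_maxima = 32 + genextreme.rvs(c=0.2, scale=1.2, size=40, random_state=rng)

# Fit the stationary GEV model to the block maxima
shape, loc, scale = genextreme.fit(annual_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted distribution
for T in (5, 10, 25, 50, 100):
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:3d}-year return level: {level:.2f} °C")
```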
8296 QSI Dynamical Fetch Policy for SMT
Authors: Shu-Chiao Yang, Jong-Jiann Shieh
Abstract:
A Simultaneous Multithreading (SMT) processor is capable of executing instructions from multiple threads in the same cycle. SMT was in fact introduced as a powerful extension of the superscalar architecture to increase processor throughput. Simultaneous multithreading is a technique that permits multiple instructions from multiple independent applications or threads to compete for limited resources each cycle. Since the fetch unit has been identified as one of the major bottlenecks of the SMT architecture, several fetch schemes have been proposed in prior work to enhance fetch efficiency and overall performance. In this paper, we propose a novel fetch policy called Queue Situation Identifier (QSI), which counts certain long-latency instructions of each thread every cycle and then selects which threads to fetch in the next cycle. Simulation results show that, in the best case, our fetch policy can achieve a 30% speedup and can also reduce the level-1 data cache miss rate.
Keywords: SMT, QSI, DL1 miss rate.
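The kind of counter-based thread selection described can be sketched schematically as below: per-thread counts of outstanding long-latency instructions are kept, and the threads with the fewest such instructions are fetched next. The counted event (an outstanding DL1 miss) and the two-thread fetch width are assumptions, not the paper's exact definition.

```python
from dataclasses import dataclass

@dataclass
class Thread:
    tid: int
    long_latency_count: int = 0      # e.g., outstanding loads that missed in the DL1 cache
    stalled: bool = False

def select_fetch_threads(threads, fetch_width=2):
    """Pick the threads with the fewest pending long-latency instructions."""
    ready = [t for t in threads if not t.stalled]
    ready.sort(key=lambda t: t.long_latency_count)
    return [t.tid for t in ready[:fetch_width]]

if __name__ == "__main__":
    threads = [Thread(0, 3), Thread(1, 0), Thread(2, 1), Thread(3, 5, stalled=True)]
    # Each simulated cycle the counters would be updated from cache-miss events,
    # and the fetch unit then asks which threads to fetch from next.
    print(select_fetch_threads(threads))   # -> [1, 2]
```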
8295 Designing Social Media into Higher Education Courses
Authors: Thapanee Seechaliao
Abstract:
This research paper presents guidance on how to design social media into higher education courses. The research methodology used a survey approach. The research instrument was a questionnaire on designing social media into higher education courses, completed by thirty-one lecturers. The data were scored by frequency and percentage. The results reflect the lecturers' opinions concerning designing social media into higher education courses, as follows: 1) Lecturers deem that the most suitable learning theory is collaborative learning. 2) Lecturers consider that the most important learning and innovation skill in the 21st century is communication and collaboration skills. 3) Lecturers think that the most suitable evaluation technique is authentic assessment. 4) Lecturers consider that the most appropriate blended learning proportion is 70% in the classroom setting and 30% online.
Keywords: Instructional design, social media, courses, higher education.
8294 Mining News Sites to Create Special Domain News Collections
Authors: David B. Bracewell, Fuji Ren, Shingo Kuroiwa
Abstract:
We present a method to create special domain collections from news sites. The method requires only a single sample article as a seed; no prior corpus statistics are needed, and the method is applicable to multiple languages. We examine various similarity measures and the creation of document collections for English and Japanese. The main contributions are as follows. First, the algorithm can build special domain collections from as little as one sample document. Second, unlike other algorithms, it does not require a second "general" corpus to compute statistics. Third, in our testing the algorithm outperformed others in creating collections made up of highly relevant articles.
Keywords: Information retrieval, news, special domain collections.
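A toy sketch of growing a collection from a single seed article using cosine similarity over bag-of-words vectors follows; the paper evaluates several similarity measures on crawled news pages, which this snippet does not attempt to reproduce.

```python
import math
import re
from collections import Counter

STOPWORDS = {"the", "a", "to", "in", "as", "for", "of", "and", "on"}

def bow(text):
    """Lowercased bag-of-words vector with a few stopwords removed."""
    return Counter(w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS)

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def build_collection(seed, candidates, threshold=0.2):
    """Keep every candidate article whose similarity to the seed exceeds the threshold."""
    seed_vec = bow(seed)
    return [doc for doc in candidates if cosine(seed_vec, bow(doc)) >= threshold]

if __name__ == "__main__":
    seed = "The central bank raised interest rates to curb inflation."
    candidates = [
        "Inflation fears push the bank to raise rates again.",
        "The striker scored twice in the championship final.",
        "Markets react as interest rates climb for a third month.",
    ]
    print(build_collection(seed, candidates))
```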
8293 An Ant-based Clustering System for Knowledge Discovery in DNA Chip Analysis Data
Authors: Minsoo Lee, Yun-mi Kim, Yearn Jeong Kim, Yoon-kyung Lee, Hyejung Yoon
Abstract:
Biological data has several characteristics that strongly differentiate it from typical business data: it is much more complex, usually large in size, and continuously changing. Until recently, business data has been the main target for discovering trends, patterns, or future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly changing from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are being used by researchers. Clustering is one of the important processes used for grouping together similar entities. There are many clustering algorithms, such as hierarchical clustering, self-organizing maps, K-means clustering, and so on. In this paper, we propose a clustering algorithm that imitates the ecosystem, taking into account the features of biological data. We implemented the system using an ant-colony clustering algorithm, and the system decides the number of clusters automatically. The system processes the input biological data, runs the ant-colony algorithm, draws the topic map, assigns clusters to the genes, and displays the output. We tested the algorithm with test data of 100 to 1,000 genes and 24 samples and show promising results for applying this algorithm to clustering DNA chip data.
Keywords: Ant colony system, biological data, clustering, DNA chip.
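A compact sketch of Lumer-Faieta-style ant clustering, the classical scheme this kind of ant-colony clustering builds on, is given below: items sit on a grid, and ants probabilistically pick up items lying in dissimilar neighborhoods and drop them where similar items already lie. The parameters and toy gene vectors are illustrative; this is not the authors' system.

```python
import numpy as np

rng = np.random.default_rng(0)

def neighborhood_density(grid, items, pos, item, s=3, alpha=0.5):
    """Average similarity of `item` to the items in an s x s neighborhood of `pos`."""
    r, c = pos
    n = grid.shape[0]
    total = 0.0
    for dr in range(-(s // 2), s // 2 + 1):
        for dc in range(-(s // 2), s // 2 + 1):
            j = grid[(r + dr) % n, (c + dc) % n]
            if j >= 0 and (dr or dc):
                total += 1 - np.linalg.norm(items[item] - items[j]) / alpha
    return max(0.0, total / (s * s))

def ant_cluster(items, grid_size=20, n_ants=10, steps=20000, k1=0.1, k2=0.15):
    grid = -np.ones((grid_size, grid_size), dtype=int)        # -1 marks an empty cell
    for i, p in enumerate(rng.choice(grid_size ** 2, len(items), replace=False)):
        grid[p // grid_size, p % grid_size] = i
    ants = [tuple(rng.integers(0, grid_size, 2)) for _ in range(n_ants)]
    carried = [None] * n_ants
    for _ in range(steps):
        for a in range(n_ants):
            r, c = ants[a]
            item = grid[r, c]
            if carried[a] is None and item >= 0:
                f = neighborhood_density(grid, items, (r, c), item)
                if rng.random() < (k1 / (k1 + f)) ** 2:        # pick up items in dissimilar spots
                    carried[a], grid[r, c] = item, -1
            elif carried[a] is not None and item < 0:
                f = neighborhood_density(grid, items, (r, c), carried[a])
                p_drop = 2 * f if f < k2 else 1.0              # drop items near similar ones
                if rng.random() < p_drop:
                    grid[r, c], carried[a] = carried[a], None
            ants[a] = ((r + rng.integers(-1, 2)) % grid_size,  # random walk on the torus
                       (c + rng.integers(-1, 2)) % grid_size)
    for a, it in enumerate(carried):                           # force-drop anything still carried
        if it is not None:
            empties = np.argwhere(grid < 0)
            r, c = empties[rng.integers(len(empties))]
            grid[r, c] = it
    return grid

if __name__ == "__main__":
    # Toy "expression profiles": two well-separated groups of genes
    genes = np.vstack([rng.normal(0.0, 0.05, (20, 4)), rng.normal(1.0, 0.05, (20, 4))])
    print(ant_cluster(genes))      # similar genes end up spatially grouped on the grid
```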
8292 The Resource Description Framework (RDF) as a Modern Structure for Medical Data
Authors: Gabriela Lindemann, Danilo Schmidt, Thomas Schrader, Dietmar Keune
Abstract:
The amount and heterogeneity of data in biomedical research, notably in interdisciplinary fields, require new methods for the collection, presentation, and analysis of information. Important data from laboratory experiments as well as patient trials are available but come from distributed resources. The Charité - University Hospital Berlin, together with the German Research Foundation (DFG), has established a new information service centre for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC). Besides the collaborative aspect of creating new research groups, every partner or institution of this science information centre that makes its own data available is allowed to search the whole data pool of the various centres involved. A core task is the implementation of a non-restricting, open data structure for the various data sources. We decided to use a modern RDF model and, in a first phase, transformed original data coming from the web-based Electronic Patient Record database TBase©.
Keywords: Medical databases, Resource Description Framework (RDF), metadata repository.
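A small sketch of how patient-record fields can be expressed as RDF triples with rdflib follows; the namespace, predicates, and values are invented for illustration and are not the OpEN.SC or TBase© schema.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

# Hypothetical namespace for the example; not the OpEN.SC vocabulary
EX = Namespace("http://example.org/nephrology#")

g = Graph()
g.bind("ex", EX)

patient = URIRef(EX["patient/12345"])
g.add((patient, RDF.type, EX.Patient))
g.add((patient, EX.hasDiagnosis, Literal("chronic kidney disease stage 3")))
g.add((patient, EX.serumCreatinine, Literal(1.8, datatype=XSD.decimal)))
g.add((patient, EX.transplantDate, Literal("2007-05-14", datatype=XSD.date)))

# Serialize the graph in Turtle; any partner centre could query the same triples with SPARQL
print(g.serialize(format="turtle"))

results = g.query("""
    PREFIX ex: <http://example.org/nephrology#>
    SELECT ?p ?diag WHERE { ?p a ex:Patient ; ex:hasDiagnosis ?diag . }
""")
for row in results:
    print(row.p, row.diag)
```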
8291 Kinematic Parameter-Independent Modeling and Measuring of Three-Axis Machine Tools
Authors: Yung-Yuan Hsu
Abstract:
The primary objective of this paper was to construct a kinematic parameter-independent modeling technique for geometric error measurement of three-axis machine tools. Improving the geometric accuracy of three-axis machine tools is one of the machine tools' core techniques. This paper first applies the traditional Homogeneous Transformation Matrix (HTM) method to derive the geometric error model for three-axis machine tools. This geometric error model is related to the three-axis kinematic parameters, with the overall errors expressed relative to the machine reference coordinate system. Given that measurement of a linear axis in this model should be performed on the ideal motion axis, there were practical difficulties. Through a measurement method consolidating translational errors and rotational errors in the geometric error model, we simplified the three-axis geometric error model to a kinematic parameter-independent model. Finally, based on the new measurement method corresponding to this error model, we established a truly practical and more accurate error measuring technique for three-axis machine tools.
Keywords: Three-axis machine tool, geometric error, HTM, error measuring.
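A brief numerical sketch of the HTM idea behind such error models is given below: the ideal motion of each axis is a homogeneous transformation, each axis carries a small error transformation (three translational and three small-angle rotational errors), and the volumetric error at the tool is the difference between the actual and ideal chains. The numbers are arbitrary and this is not the paper's specific model.

```python
import numpy as np

def translation(x=0.0, y=0.0, z=0.0):
    """Ideal prismatic-axis motion as a 4x4 homogeneous transformation matrix."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def error_htm(dx, dy, dz, ea, eb, ec):
    """Small-error HTM: translational errors and small-angle rotations about x, y, z."""
    E = np.eye(4)
    E[:3, :3] += np.array([[0, -ec, eb],
                           [ec, 0, -ea],
                           [-eb, ea, 0]])   # first-order rotational error terms
    E[:3, 3] = [dx, dy, dz]
    return E

# Commanded position of the three axes (mm)
x, y, z = 100.0, 50.0, 20.0

# Hypothetical error components of each axis at that position (mm, rad)
Ex = error_htm(0.002, 0.001, -0.001, 1e-5, 2e-5, -1e-5)
Ey = error_htm(-0.001, 0.003, 0.000, -2e-5, 1e-5, 1e-5)
Ez = error_htm(0.000, -0.002, 0.004, 1e-5, -1e-5, 2e-5)

ideal = translation(x) @ translation(y=y) @ translation(z=z)
actual = translation(x) @ Ex @ translation(y=y) @ Ey @ translation(z=z) @ Ez

tool = np.array([0.0, 0.0, 0.0, 1.0])          # tool reference point
volumetric_error = (actual - ideal) @ tool      # error vector in machine coordinates
print(volumetric_error[:3])
```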
8290 Pre-germinated Parboiled Brown Rice Drying Using Fluidization Technique
Authors: Nattapol Poomsa-ad, Lamul Wiset
Abstract:
Pre-germinated parboiled brown rice, or Khao hang (in Thai), is paddy which undergoes the processes of soaking, steaming, drying, and dehusking to obtain an edible form for consumption. The objectives of this research were to study the drying kinetics of pre-germinated parboiled brown rice using the fluidization technique and to study its properties after drying. Drying was performed at temperatures of 110, 120, and 130 °C, a bed depth of 2 cm, and an air velocity of 1.98 m/s. The results showed that a higher drying temperature led to faster moisture reduction. After drying until the moisture content of the pre-germinated parboiled brown rice was lower than 14% wet basis, samples were taken to determine various qualities such as head rice percentage and L* a* b* color values. Shade drying was used as a control. A higher drying temperature resulted in a decrease in head rice percentage. For the color assessment, the L* and a* values increased with drying temperature, while the b* value was not significantly different (p > 0.05) among drying temperatures. However, the b* value obtained with the fluidized bed dryer was higher than that of the control.
Keywords: Brown rice, dehydration, fluidized bed, grain.
8289 An Efficient Motion Recognition System Based on LMA Technique and a Discrete Hidden Markov Model
Authors: Insaf Ajili, Malik Mallem, Jean-Yves Didier
Abstract:
Human motion recognition has received extensive attention in recent years due to its importance in a wide range of applications, such as human-computer interaction, intelligent surveillance, augmented reality, content-based video compression and retrieval, etc. However, it is still regarded as a challenging task, especially in realistic scenarios. It can be seen as a general machine learning problem which requires an effective human motion representation and an efficient learning method. In this work, we introduce a descriptor based on the Laban Movement Analysis (LMA) technique, a formal and universal language for human movement, to capture both quantitative and qualitative aspects of movement. We use a Discrete Hidden Markov Model (DHMM) for training and classifying motions. We improve the classification algorithm by proposing two DHMMs for each motion class to process the motion sequence in two different directions, forward and backward. This modification helps avoid the misclassification that can happen when recognizing similar motions. Two experiments are conducted. In the first, we evaluate our method on a public dataset, the Microsoft Research Cambridge-12 Kinect gesture dataset (MSRC-12), which is widely used for evaluating action/gesture recognition methods. In the second experiment, we build a dataset composed of 10 gestures (introduce yourself, waving, dance, move, turn left, turn right, stop, sit down, increase velocity, decrease velocity) performed by 20 persons. The evaluation of the system includes testing the efficiency of our LMA-based descriptor vector with the basic DHMM method and comparing the recognition results of the modified DHMM with those of the original one. Experimental results demonstrate that our method outperforms most existing methods that use the MSRC-12 dataset and achieves a near-perfect classification rate on our own dataset.
Keywords: Human motion recognition, motion representation, Laban Movement Analysis, Discrete Hidden Markov Model.
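The two-direction idea can be sketched in a self-contained way as follows: one discrete HMM per class is scored with the forward algorithm on the observation sequence and another on its reverse, and the class with the best combined log-likelihood wins. The toy random models stand in for HMMs trained on quantized LMA descriptor sequences.

```python
import numpy as np

def forward_loglik(obs, start, trans, emit):
    """Scaled forward algorithm: log P(obs | model) for a discrete HMM."""
    alpha = start * emit[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

def classify(obs, models):
    """models: {label: (forward_hmm, backward_hmm)}, each HMM a (start, trans, emit) tuple."""
    scores = {label: forward_loglik(obs, *fwd) + forward_loglik(obs[::-1], *bwd)
              for label, (fwd, bwd) in models.items()}
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    rng = np.random.default_rng(5)

    def random_hmm(n_states=3, n_symbols=4):
        start = rng.dirichlet(np.ones(n_states))
        trans = rng.dirichlet(np.ones(n_states), size=n_states)   # rows sum to 1
        emit = rng.dirichlet(np.ones(n_symbols), size=n_states)
        return start, trans, emit

    # Two toy gesture classes, each with a forward-direction and a backward-direction model
    models = {"waving": (random_hmm(), random_hmm()),
              "turn_left": (random_hmm(), random_hmm())}
    observation = list(rng.integers(0, 4, size=30))   # quantized LMA descriptor symbols
    print(classify(observation, models)[0])
```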
8288 XML Data Management in Compressed Relational Database
Authors: Hongzhi Wang, Jianzhong Li, Hong Gao
Abstract:
XML is an important standard for data exchange and representation. Since relational databases are mature systems, using a relational database to support XML data may bring some advantages. However, storing XML in a relational database involves obvious redundancy that wastes disk space, bandwidth, and disk I/O when querying XML data. For efficient storage and querying of XML, it is necessary to use compressed XML data in the relational database. In this paper, a compressed relational database technology supporting XML data is presented. The original relational storage structure is adapted to XPath query processing, and the compression method preserves this feature. Besides traditional relational database techniques, additional query processing technologies for compressed relations and for the special structure of XML are presented, including technologies for XQuery processing in the compressed relational database.
Keywords: XML, compression, query processing.
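As a toy illustration of one common way to hold XML in a relational store (not the paper's XPath/XQuery-aware compression scheme), the sketch below shreds a document into an edge table (node, parent, tag, value) and zlib-compresses the text values.

```python
import sqlite3
import zlib
import xml.etree.ElementTree as ET
from itertools import count

XML = """<library>
  <book id="1"><title>Data on the Web</title><year>1999</year></book>
  <book id="2"><title>XML in a Nutshell</title><year>2004</year></book>
</library>"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE edge (id INTEGER PRIMARY KEY, parent INTEGER, tag TEXT, value BLOB)")
_ids = count(1)

def shred(elem, parent=None):
    """Store each element as a row of the edge table; text content is zlib-compressed."""
    node_id = next(_ids)
    text = elem.text.strip() if elem.text else ""
    value = zlib.compress(text.encode()) if text else None
    conn.execute("INSERT INTO edge VALUES (?, ?, ?, ?)", (node_id, parent, elem.tag, value))
    for child in elem:
        shred(child, node_id)

shred(ET.fromstring(XML))

# A simple path query (/library/book/title), decompressing values on the way out
rows = conn.execute("""
    SELECT t.value FROM edge l
    JOIN edge b ON b.parent = l.id AND b.tag = 'book'
    JOIN edge t ON t.parent = b.id AND t.tag = 'title'
    WHERE l.tag = 'library' AND l.parent IS NULL
""").fetchall()
print([zlib.decompress(v).decode() for (v,) in rows])
```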
8287 Smart Surveillance using PDA
Authors: Basem Mustafa Abd. Amer, Syed Abdul Rahman Al-Attas
Abstract:
The aim of this research is to develop a fast and reliable surveillance system based on a personal digital assistant (PDA) device, extending to the device a moving-object detection capability already available on personal computers. A second aim is to compare the performance of the Background Subtraction (BS) and Temporal Frame Differencing (TFD) techniques on the PDA platform to determine which is more suitable. In order to reduce noise and to prepare frames for the moving-object detection part, each frame is first converted to a gray-scale representation and then smoothed using a Gaussian low-pass filter. Two moving-object detection schemes, i.e., BS and TFD, have been analyzed. The background frame is updated using an Infinite Impulse Response (IIR) filter so that it adapts to varying illumination conditions and geometry settings. To reduce the effect of noise pixels resulting from frame differencing, the morphological filters erosion and dilation are applied. In this research, it has been found that the TFD technique is more suitable than BS for motion detection in terms of speed; on average, TFD is approximately 170 ms faster than the BS technique.
Keywords: Surveillance, PDA, motion detection, image processing, background subtraction.
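A condensed sketch of the two detection schemes is given below: both smooth a grayscale frame with a Gaussian filter; BS differences against an IIR-updated background, TFD differences against the previous frame, and erosion/dilation remove noise pixels. The capture source and thresholds are illustrative, and this desktop OpenCV code only mirrors the processing steps, not the PDA implementation.

```python
import cv2
import numpy as np

ALPHA = 0.05                                 # IIR update weight for the background model
KERNEL = np.ones((3, 3), np.uint8)

def preprocess(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.GaussianBlur(gray, (5, 5), 0)

def detect(diff, thresh=25):
    """Threshold a difference image and clean it with erosion followed by dilation."""
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.erode(mask, KERNEL, iterations=1)
    return cv2.dilate(mask, KERNEL, iterations=2)

cap = cv2.VideoCapture(0)                    # any video source stands in for the PDA camera
ok, frame = cap.read()
if not ok:
    raise SystemExit("no video source available")
prev = preprocess(frame)
background = prev.astype(np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = preprocess(frame)

    # Background subtraction: difference against the IIR-filtered background
    bs_mask = detect(cv2.absdiff(gray, cv2.convertScaleAbs(background)))
    cv2.accumulateWeighted(gray, background, ALPHA)   # B <- (1 - alpha) * B + alpha * F

    # Temporal frame differencing: difference against the previous frame only
    tfd_mask = detect(cv2.absdiff(gray, prev))
    prev = gray

    cv2.imshow("BS", bs_mask)
    cv2.imshow("TFD", tfd_mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```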
8286 A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data
Authors: P. Kaladevi, N. Giridharan
Abstract:
The system for analyzing and eliciting public grievances serves its main purpose of receiving and processing all sorts of complaints from the public and responding to users. Due to the large number of complaints, the data become big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of caching is applied in the system to provide immediate response and timely action using big data analytics; cache-enabled big data improves the response time of the system. The unstructured data provided by users are efficiently handled through the MapReduce algorithm. Complaints are processed according to the hierarchy of authority. The drawbacks of the traditional database used in the existing system are addressed by our system through the cache-enabled Hadoop Distributed File System. Since MapReduce framework code can possibly leak sensitive data through the computation process, we propose adding noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If a complaint is not processed within the allotted time, it is automatically forwarded to the higher authority, which ensures that processing takes place. A copy of the filed complaint is sent as a digitally signed PDF document to the user's e-mail address, which serves as proof. The system report serves as essential data when making important decisions based on legislation.
Keywords: Big data, Hadoop, HDFS, caching, MapReduce, web personalization, e-governance.
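A local, in-memory sketch of the processing idea follows: a map step emits one record per complaint, a reduce step aggregates counts per department, Laplace noise is added to the reduce output so exact sensitive counts are not exposed, and an LRU cache serves repeated queries immediately. This simulates the design on one machine and is not Hadoop/HDFS code.

```python
from collections import defaultdict
from functools import lru_cache
import numpy as np

# Toy complaint records: (department, complaint text)
COMPLAINTS = [
    ("water", "no supply in ward 4"),
    ("roads", "pothole near the market"),
    ("water", "pipe leakage on main street"),
    ("water", "low pressure in ward 4"),
    ("roads", "broken streetlight pole"),
] * 1000

def map_phase(records):
    """Map: emit (department, 1) for every complaint."""
    for dept, _text in records:
        yield dept, 1

def reduce_phase(pairs, noise_scale=1.0):
    """Reduce: sum counts per department, then add Laplace noise to the output."""
    counts = defaultdict(int)
    for dept, one in pairs:
        counts[dept] += one
    rng = np.random.default_rng(0)
    return {dept: c + rng.laplace(scale=noise_scale) for dept, c in counts.items()}

@lru_cache(maxsize=128)
def complaints_per_department():
    """Cached query: repeated calls return immediately instead of re-running the job."""
    return tuple(sorted(reduce_phase(map_phase(COMPLAINTS)).items()))

if __name__ == "__main__":
    print(complaints_per_department())   # first call runs the map/reduce job
    print(complaints_per_department())   # second call is served from the cache
```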
8285 Delay-dependent Stability Analysis for Uncertain Switched Neutral System
Authors: Lianglin Xiong, Shouming Zhong, Mao Ye
Abstract:
This paper considers robust exponential stability for a class of uncertain switched neutral systems whose delays switch according to the switching rule. The system under consideration includes both stable and unstable subsystems. The uncertainties considered in this paper are norm-bounded and possibly time-varying. Based on the multiple Lyapunov functional approach and the dwell-time technique, the time-dependent switching rule is designed depending on the so-called average dwell time of the stable subsystems as well as the ratio of the total activation times of the stable and unstable subsystems. It is shown that, by suitably controlling the switching between the stable and unstable modes, robust stabilization of the switched uncertain neutral systems can be achieved. Two simulation examples are given to demonstrate the effectiveness of the proposed method.
Keywords: Switched neutral system, exponential stability, multiple Lyapunov functional, dwell time technique, time-dependent switching rule.
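For readers unfamiliar with the dwell-time technique, a commonly used form of the two conditions referred to above is sketched below in generic notation (the constants are not the paper's specific bounds): the stable subsystems must be active long enough relative to the unstable ones, and switching must not occur too frequently on average.

```latex
% T^-(t_0,t), T^+(t_0,t): total activation time of the stable / unstable subsystems on [t_0,t]
% \lambda^-, \lambda^+: decay rate of the stable modes, growth rate of the unstable modes
% \mu \ge 1: bound on the jumps of the multiple Lyapunov functionals at switching instants
\frac{T^-(t_0,t)}{T^+(t_0,t)} \ge \frac{\lambda^+ + \lambda^*}{\lambda^- - \lambda^*}
\quad \text{for some } \lambda^* \in (0,\lambda^-),
\qquad
\tau_a \ge \tau_a^* = \frac{\ln \mu}{\lambda^*}.
```

Under conditions of this type, the multiple Lyapunov functional decreases on average along the switching signal and exponential stability follows.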
8284 Simplified Space Vector Based Decoupled Switching Strategy for Indirect Vector Controlled Open-End Winding Induction Motor Drive
Authors: Syed Munvar Ali, V. Vijaya Kumar Reddy, M. Surya Kalavathi
Abstract:
In this paper, a dual inverter configuration has been implemented for an induction motor drive. This isolated dual inverter is capable of producing high-quality output voltage and minimizing the common mode voltage (CMV). For this isolated dual inverter, a decoupled space vector based pulse width modulation (PWM) technique is proposed. Conventional space vector based PWM (SVPWM) techniques require reference voltage vector calculation and sector identification. The proposed decoupled SVPWM technique generates gating pulses from instantaneous phase voltages and gives a CMV of ±Vdc/6. To evaluate the proposed algorithm, MATLAB-based simulation studies are carried out on an indirect vector controlled open-end winding induction motor drive.
Keywords: Inverter configuration, decoupled SVPWM, common mode voltage, vector control.
8283 A Practical Construction Technique to Enhance the Performance of Rock Bolts in Tunnels
Authors: O. Chaudhari, A. N. Ghafar, G. Zirgulis, M. Mousavi, T. Ellison, S. Pousette, P. Fontana
Abstract:
In Swedish tunnel construction, a critical issue that has been repeatedly acknowledged is corrosion and, consequently, failure of the rock bolts in rock support systems. Defective installation of rock bolts results in the formation of cavities in the cement mortar that is regularly used to fill the area under the dome plates. These voids allow water ingress into the rock bolt assembly, which results in corrosion of rock bolt components and, eventually, failure. In addition, the current installation technique consists of several manual, labor-intensive steps that are usually carried out in uncomfortable and exhausting conditions, e.g., under the roof of the tunnel. Such intense tasks also lead to considerable waste of materials and to execution errors. Moreover, adequate quality control of the execution is hardly possible with the current technique. To overcome these issues, a non-shrinking/expansive cement-based mortar packed in paper packaging has been developed in this study; it properly fills the area under the dome plates with few or no remaining cavities, which ultimately diminishes the potential for corrosion. This article summarizes the development process and the experimental evaluation of this technique for the installation of rock bolts. In the development process, the cementitious mortar was first developed using specific cement and shrinkage-reducing/expansive additives. The mechanical and flow properties of the mortar were then evaluated using compressive strength, density, and slump flow measurement methods. In addition, isothermal calorimetry and shrinkage/expansion measurements were used to elucidate the hydration and durability attributes of the mortar. After the desired properties were obtained in both fresh and hardened conditions, the developed dry mortar was filled into specific permeable paper packaging and then submerged in a water bath for specific intervals before installation. The tests were enhanced progressively by optimizing different parameters such as the shape and size of the packaging, the characteristics of the paper used, the immersion time in water, and even some minor characteristics of the mortar. Finally, the developed prototype was tested in a lab-scale rock bolt assembly at various angles to analyze the efficiency of the method in a real-life scenario. The results showed that the new technique improves the performance of the rock bolts by reducing material wastage, improving environmental performance, facilitating and accelerating the labor, and enhancing the durability of the whole system. Accordingly, this approach provides an efficient alternative to the traditional way of tunnel bolt installation, with considerable advantages for the Swedish tunneling industry.
Keywords: Corrosion, durability, mortar, rock bolt.
8282 Pre-Service EFL Teachers' Perceptions of Written Corrective Feedback in a Wiki-Based Environment
Authors: Mabel Ortiz, Claudio Díaz
Abstract:
This paper explores Chilean pre-service teachers' perceptions of the provision of corrective feedback in a wiki environment during the collaborative writing of an argumentative essay. After semi-structured interviews were conducted with 22 participants, the data were processed using the content analysis technique. The results show that students have positive perceptions of corrective feedback provided through a wiki virtual environment, which in turn facilitates feedback provision and has an effective impact on language learning. Some of the positive perceptions of virtual feedback concern permanent access, efficiency, simultaneous revision, and immediacy. It would therefore be advisable to integrate wiki-based feedback as a methodology for the language classroom and collaborative writing tasks.
Keywords: Argumentative essay, focused corrective feedback, perception, wiki environment.