Search results for: Text Processing
1724 Improving Topic Quality of Scripts by Using Scene Similarity Based Word Co-Occurrence
Authors: Yunseok Noh, Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park
Abstract:
Scripts are one of the basic text resources for understanding broadcasting contents. Topic modeling is a method for summarizing broadcasting contents from their scripts. Generally, scripts represent contents descriptively with directions and speeches, and provide scene segments that can be seen as semantic units. Therefore, a script can be topic modeled by treating each scene segment as a document. Because scene segments consist mainly of speeches, however, relatively few word co-occurrences are observed within them. This inevitably degrades the quality of topics learned by statistical methods. To tackle this problem, we propose a method to improve topic quality with additional word co-occurrence information obtained from scene similarities. The main idea is that knowing two or more texts are topically related is useful for learning high-quality topics, and in turn more accurate topical representations give more accurate information about whether two texts are related. In this paper, we regard two scene segments as related if their topical similarity is high enough, and we consider words to co-occur if they appear together in topically related scene segments. By iteratively inferring topics and determining semantically neighboring scene segments, we obtain a topic space that represents broadcasting contents well. In the experiments, we show that the proposed method generates higher-quality topics from Korean drama scripts than the baselines.
Keywords: Broadcasting contents, generalized Pólya urn model, scripts, text similarity, topic model.
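As an illustration of the scene-similarity-based co-occurrence idea, the sketch below augments a standard scene-term co-occurrence matrix with word pairs drawn from topically similar scene segments. It is a minimal Python sketch, not the authors' implementation: the LDA inference, the similarity threshold and the single augmentation pass are assumptions, and the generalized Pólya urn sampler itself is not shown.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

def augmented_cooccurrence(scenes, n_topics=10, sim_threshold=0.7):
    """Augment within-scene word co-occurrence with co-occurrences across
    topically related scene segments (illustrative stand-in for the paper's
    iterative scheme)."""
    vec = CountVectorizer()
    X = vec.fit_transform(scenes)                       # scene-term count matrix
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    theta = lda.fit_transform(X)                        # scene-topic proportions
    sim = cosine_similarity(theta)                      # scene-scene topical similarity
    C = (X.T @ X).toarray().astype(float)               # ordinary co-occurrence counts
    for i in range(len(scenes)):
        for j in range(i + 1, len(scenes)):
            if sim[i, j] >= sim_threshold:              # treat scenes as topical neighbors
                xi = X[i].toarray().ravel()
                xj = X[j].toarray().ravel()
                C += np.outer(xi, xj) + np.outer(xj, xi)
    return C, vec.get_feature_names_out()
```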
1723 An Implementation of Data Reusable MPEG Video Coding Scheme
Authors: Vasily G. Moshnyaga
Abstract:
This paper presents an optimized MPEG-2 video codec implementation, which drastically reduces the number of computations and memory accesses required for video compression. Unlike the traditional scheme, we reuse data stored in frame memory to omit unnecessary coding operations and memory reads/writes for unchanged macroblocks. Due to dynamic memory sharing among reference frames, data-driven macroblock characterization and selective macroblock processing, we perform less than 15% of the total operations required by a conventional coder while maintaining high picture quality.
Keywords: Data reuse, adaptive processing, video coding, MPEG
1722 Texture Feature Extraction using Slant-Hadamard Transform
Authors: M. J. Nassiri, A. Vafaei, A. Monadjemi
Abstract:
Random and natural texture classification is still one of the biggest challenges in the field of image processing and pattern recognition. In this paper, texture feature extraction using the Slant-Hadamard Transform was studied and compared to other signal processing-based texture classification schemes. A parametric SHT was also introduced and employed for natural texture feature extraction. We showed that a subtly modified parametric SHT can outperform the ordinary Walsh-Hadamard transform and the discrete cosine transform. Experiments were carried out on a subset of Vistex random natural texture images using a kNN classifier.
Keywords: Texture Analysis, Slant Transform, Hadamard, DCT.
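For readers unfamiliar with transform-based texture descriptors, the following Python sketch computes block-energy features from a separable 2-D Walsh-Hadamard transform and feeds them to a kNN classifier. It stands in for, and does not reproduce, the parametric Slant-Hadamard transform studied in the paper; the patch size, block layout and k are assumptions.

```python
import numpy as np
from scipy.linalg import hadamard
from sklearn.neighbors import KNeighborsClassifier

def hadamard_features(patch):
    """Block-energy features from a 2-D Walsh-Hadamard transform of a square
    patch whose side is a power of two (a stand-in for the parametric SHT)."""
    n = patch.shape[0]
    H = hadamard(n) / np.sqrt(n)
    coeffs = H @ patch @ H.T                          # separable 2-D transform
    # 4x4 grid of coefficient-block energies as a compact texture descriptor
    blocks = coeffs.reshape(4, n // 4, 4, n // 4).swapaxes(1, 2)
    return (blocks ** 2).sum(axis=(2, 3)).ravel()

# usage: features for a list of 64x64 grayscale patches, then a kNN classifier
# X = np.array([hadamard_features(p) for p in patches]); y = labels
# clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
```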
1721 Retina Based Mouse Control (RBMC)
Authors: Arslan Qamar Malik, Jehanzeb Ahmad
Abstract:
The paper presents a novel idea for controlling computer mouse cursor movement with the human eyes. It describes the working of the product and how it helps people with special needs share their knowledge with the world. A number of traditional techniques, such as head and eye movement tracking systems, exist for cursor control using image processing in which light is the primary source. Electro-oculography (EOG) is a new technology for sensing eye signals with which the mouse cursor can be controlled. The signals captured using sensors are first amplified, then denoised and digitized, before being transferred to a PC for software interfacing.
Keywords: Human Computer Interaction, Real-Time System, Electro-oculography, Signal Processing.
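A minimal Python sketch of the kind of signal chain the abstract describes (amplify, denoise, digitize, then map to cursor motion) is given below. The band edges, gain and mapping are illustrative assumptions, not the paper's design values.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def eog_to_cursor_steps(eog, fs=250, gain=40):
    """Illustrative EOG processing chain: band-limit the amplified signal,
    remove noise and drift, then map the filtered amplitude of one channel
    to cursor steps (thresholds and gain are assumptions)."""
    b, a = butter(4, [0.5, 20], btype="bandpass", fs=fs)   # pass the EOG band
    clean = filtfilt(b, a, eog)                            # zero-phase noise removal
    steps = np.round(gain * clean / np.max(np.abs(clean))).astype(int)
    return steps                                           # pixels to move per sample
```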
1720 Adjusting the Furnace and Converter Temperature of the Sulfur Recovery Units
Authors: Hamid Reza Mahdipoor, Hamid Ganji, Hamed Naderi, Hajar Yousefian, Hooman Javaherizadeh
Abstract:
The modified Claus process is commonly used in oil refining and gas processing to recover sulfur and destroy contaminants formed in upstream processing. A Claus furnace feed containing a relatively low concentration of H2S may be incapable of producing a stable flame. Also, incomplete combustion of hydrocarbons in the feed can lead to deterioration of the catalyst in the reactors due to soot or carbon deposition. Therefore, special consideration is necessary to achieve the appropriate overall sulfur recovery. In this paper, some configurations available to treat lean acid gas streams are described and the most appropriate ones are studied to overcome low H2S concentration problems. As a result, overall sulfur recovery is investigated for feed preheating and hot gas configurations.
Keywords: Sulfur recovery unit, Low H2S content
1719 A Novel Method for Behavior Modeling in Uncertain Information Systems
Authors: Ali Haroonabadi, Mohammad Teshnehlab
Abstract:
None of the process models in software development has addressed software systems performance evaluation and modeling; likewise, uncertainty exists in information systems because of the nature of requirements, and this may cause further challenges during software development. By defining an extended version of UML (Fuzzy-UML), functional requirements of the software that are specified with uncertainty can be supported. In this study, the behavioral description of uncertain information systems with the aid of fuzzy state diagrams is crucial; moreover, the role of behavioral diagrams in F-UML is investigated in the software performance modeling process. To this end, a fuzzy sub-profile is used.
Keywords: Fuzzy System, Software Development Model, Software Performance Evaluation, UML
1718 A New Approach to Design an Efficient CIC Decimator Using Signed Digit Arithmetic
Authors: Vishal Awasthi, Krishna Raj
Abstract:
Any digital processing performed on a signal with a larger Nyquist interval requires more computation than signal processing performed on a smaller Nyquist interval. Sampling rate alteration generates unwanted effects in the system, such as spectral aliasing and spectral imaging, during signal processing. A multirate, multistage implementation of a digital filter can yield significant computational savings compared with a single-rate filter designed for sample rate conversion. In this paper, we present an efficient cascaded integrator-comb (CIC) decimation filter that performs fast downsampling using a signed digit adder algorithm, with compensation for the frequency droop that arises due to the aliasing effect during the decimation process. The proposed compensated CIC decimation filter structure with a hybrid signed digit (HSD) fast adder improves downsampling speed by 65.15% compared to a ripple carry adder (RCA), and reduces area and power by 57.5% and 0.01%, respectively, compared to signed digit (SD) adder algorithms.
Keywords: Sampling rate conversion, Multirate Filtering, Compensation Theory, Decimation filter, CIC filter, Redundant signed digit arithmetic, Fast adders.
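To make the CIC structure concrete, here is a minimal Python sketch of an N-stage CIC decimator: integrators at the input rate, decimation by R, then combs with differential delay M. The droop-compensation FIR and the hybrid signed digit adder hardware that the paper is actually about are not modeled, and R, N and M below are arbitrary example values.

```python
import numpy as np

def cic_decimate(x, R=8, N=3, M=1):
    """Minimal CIC decimator: N integrators at the input rate, decimation by R,
    then N comb stages with differential delay M (droop compensation omitted)."""
    y = np.asarray(x, dtype=np.int64)
    for _ in range(N):                                  # integrator section
        y = np.cumsum(y)
    y = y[::R]                                          # rate reduction by R
    for _ in range(N):                                  # comb section
        y = y - np.concatenate((np.zeros(M, dtype=y.dtype), y[:-M]))
    return y / float((R * M) ** N)                      # normalise the (RM)^N DC gain
```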
1717 FPGA Hardware Implementation and Evaluation of a Micro-Network Architecture for Multi-Core Systems
Authors: Yahia Salah, Med Lassaad Kaddachi, Rached Tourki
Abstract:
This paper presents the design, implementation and evaluation of a micro-network, or Network-on-Chip (NoC), based on a generic pipeline router architecture. The router is designed to efficiently support traffic generated by multimedia applications on embedded multi-core systems. It employs a simple routing mechanism and implements the round-robin scheduling strategy to resolve output port contentions and minimize latency. Virtual channel flow control is applied to avoid the head-of-line blocking problem and enhance performance in the NoC. The hardware design of the router architecture has been implemented at the register transfer level; its functionality is evaluated in the case of the two-dimensional Mesh/Torus topology, and performance results are derived from the ModelSim simulator and the Xilinx ISE 9.2i synthesis tool. An example of a multi-core image processing system utilizing the NoC structure has been implemented and validated to demonstrate the capability of the proposed micro-network architecture. To reduce the complexity of the image compression and decompression architecture, the system uses an image processing algorithm based on the classical discrete cosine transform with an efficient zonal processing approach. The experimental results confirm that both the proposed image compression scheme and the NoC architecture can achieve reasonable image quality with low processing time.
Keywords: Generic Pipeline Network-on-Chip Router Architecture, JPEG Image Compression, FPGA Hardware Implementation, Performance Evaluation.
1716 Morpho-Phonological Modelling in Natural Language Processing
Authors: Eleni Galiotou, Angela Ralli
Abstract:
In this paper we propose a computational model for the representation and processing of morpho-phonological phenomena in a natural language, like Modern Greek. We aim at a unified treatment of inflection, compounding, and word-internal phonological changes, in a model that is used for both analysis and generation. After discussing certain difficulties caused by well-known finite-state approaches, such as Koskenniemi's two-level model [7], when applied to a computational treatment of compounding, we argue that a morphology-based model provides a more adequate account of word-internal phenomena. Contrary to the finite-state approaches that cannot handle hierarchical word constituency in a satisfactory way, we propose a unification-based word grammar, as the nucleus of our strategy, which takes into consideration word representations that are based on affixation and [stem stem] or [stem word] compounds. In our formalism, feature-passing operations are formulated with the use of the unification device, and phonological rules modeling the correspondence between lexical and surface forms apply at morpheme boundaries. In the paper, examples from Modern Greek illustrate our approach. Morpheme structures, stress, and morphologically conditioned phoneme changes are analyzed and generated in a principled way.
Keywords: Morpho-Phonology, Natural Language Processing.
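To illustrate the feature-passing-by-unification idea mentioned above, here is a toy Python sketch of unification over flat feature structures. Real word grammars of the kind the paper describes use recursive, typed feature structures; the features and values below are hypothetical examples.

```python
def unify(fs1, fs2):
    """Unify two flat feature structures represented as dicts; return None on
    a feature clash (a toy version of the unification device, not the paper's
    formalism)."""
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result and result[feat] != val:
            return None                      # clashing values: unification fails
        result[feat] = val
    return result

# usage: a stem and a suffix agree on gender, and the suffix adds number/case
# stem   = {"cat": "noun", "gender": "fem"}
# suffix = {"gender": "fem", "number": "sg", "case": "nom"}
# unify(stem, suffix)  ->  {'cat': 'noun', 'gender': 'fem', 'number': 'sg', 'case': 'nom'}
```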
1715 Advanced Image Analysis Tools Development for the Early Stage Bronchial Cancer Detection
Authors: P. Bountris, E. Farantatos, N. Apostolou
Abstract:
Autofluorescence (AF) bronchoscopy is an established method to detect dysplasia and carcinoma in situ (CIS). For this reason the "Sotiria" Hospital uses the Karl Storz D-light system. However, in early tumor stages the visualization is not that obvious. With the help of a PC, we analyzed the captured color images by developing certain tools in Matlab®. We used statistical methods based on texture analysis, signal processing methods based on Gabor models, and conversion algorithms between device-dependent color spaces. Our belief is that we reduced the error made by the naked eye. The tools we implemented improve the quality of patients' lives.
Keywords: Bronchoscopy, digital image processing, lung cancer, texture analysis.
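As a hedged illustration of Gabor-model-based texture analysis of this kind, the Python sketch below extracts mean/variance statistics from a small Gabor filter bank. It is not the authors' Matlab toolchain; the frequencies and orientation count are assumptions.

```python
import numpy as np
from skimage.filters import gabor

def gabor_texture_features(gray, frequencies=(0.1, 0.2, 0.3), n_orient=4):
    """Mean and variance of Gabor filter response magnitudes over a grayscale
    image, a common texture descriptor (a sketch, not the paper's exact tools)."""
    feats = []
    for f in frequencies:
        for k in range(n_orient):
            real, imag = gabor(gray, frequency=f, theta=k * np.pi / n_orient)
            mag = np.hypot(real, imag)        # response magnitude per pixel
            feats.extend([mag.mean(), mag.var()])
    return np.array(feats)
```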
1714 Performance Evaluation of Compression Algorithms for Developing and Testing Industrial Imaging Systems
Authors: Daniel F. Garcia, Julio Molleda, Francisco Gonzalez, Ruben Usamentiaga
Abstract:
The development of many measurement and inspection systems for products based on real-time image processing cannot be carried out entirely in a laboratory, due to the size or the temperature of the manufactured products. Those systems must be developed in successive phases. Firstly, the system is installed in the production line with only an operational service to acquire images of the products and other complementary signals. Next, a recording service for the images and signals must be developed and integrated into the system. Only after a large set of product images is available can the development of the real-time image processing algorithms for measurement or inspection of the products be accomplished under realistic conditions. Finally, the recording service is turned off or eliminated and the system operates only with the real-time services for the acquisition and processing of the images. This article presents a systematic performance evaluation of the image compression algorithms currently available to implement a real-time recording service. The results allow establishing a trade-off between the reduction or compression of the image size and the CPU time required to reach that compression level.
Keywords: Lossless image compression, codec performance evaluation, grayscale codec comparison, real-time image recording.
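The trade-off the article evaluates (compression ratio versus CPU time) can be measured along the following lines. This Python sketch uses standard-library codecs as placeholders; the codec set actually benchmarked in the paper may differ.

```python
import bz2
import lzma
import time
import zlib

def benchmark_codecs(raw_bytes):
    """Measure compression ratio vs. CPU time for a few general-purpose
    lossless codecs on one raw grayscale frame (illustrative codec set)."""
    codecs = {
        "zlib-6": lambda d: zlib.compress(d, 6),
        "bz2-9": lambda d: bz2.compress(d, 9),
        "lzma": lzma.compress,
    }
    results = {}
    for name, fn in codecs.items():
        t0 = time.process_time()
        out = fn(raw_bytes)
        cpu = time.process_time() - t0
        results[name] = (len(raw_bytes) / len(out), cpu)   # (ratio, CPU seconds)
    return results
```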
1713 Enhancing the Performance of Wireless Sensor Networks Using Low Power Design
Authors: N. Mahendran, R. Madhuranthi
Abstract:
Wireless sensor networks (WSNs) are constantly in demand to process information more rapidly with less energy and area cost. Presently, processor-based solutions find it difficult to achieve high processing speed with low power consumption. This paper presents a simple and accurate data processing scheme for a low-power wireless sensor node, based on a reduced number of processing elements (PEs). The presented model provides a simple recursive structure (SRS) to process the sampled data in the wireless sensor environment and to reduce the power consumption of the wireless sensor node. Based on this model, the incoming samples are processed to produce a smaller amount of data sufficient to reconstruct the original signal. The ModelSim simulator is used to simulate the SRS structure, and functional simulation is carried out for the validation of the presented architecture. The Xilinx Power Estimator (XPE) tool is used to measure the power consumption. The experimental results show an average power consumption of 91 mW; this is a 42% improvement compared to the folded tree architecture.
Keywords: Power consumption, energy efficiency, low power WSN node, recursive structure, sleep/wake scheduling.
1712 Fuzzy Mathematical Morphology approach in Image Processing
Authors: Yee Yee Htun, Dr. Khaing Khaing Aye
Abstract:
Morphological operators transform the original image into another image through its interaction with another image of a certain shape and size, known as the structuring element. Mathematical morphology provides a systematic approach to analyze the geometric characteristics of signals or images, and has been applied widely to many applications such as edge detection, object segmentation, noise suppression and so on. Fuzzy mathematical morphology aims to extend the binary morphological operators to grey-level images. In order to define the basic morphological operations such as fuzzy erosion, dilation, opening and closing, a general method based upon fuzzy implication and inclusion grade operators is introduced. The fuzzy morphological operations extend the ordinary morphological operations by using fuzzy sets, where the union operation is replaced by a maximum operation and the intersection operation is replaced by a minimum operation. This work consists of two parts. In the first, fuzzy set theory, fuzzy mathematical morphology based on fuzzy logic and fuzzy set theory, and fuzzy mathematical operations and their properties are studied in detail. In the second part, the application of fuzziness in mathematical morphology to practical work such as image processing is discussed with illustrative problems.
Keywords: Binary Morphological, Fuzzy sets, Grayscale morphology, Image processing, Mathematical morphology.
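A minimal Python sketch of the max/min formulation described above is given below: flat grayscale dilation takes the neighbourhood maximum and erosion the minimum. The fuzzy operators the paper studies would replace these extrema with implication and inclusion-grade operators; that extension is not shown here.

```python
import numpy as np

def _morph(img, se, op):
    """Flat grayscale morphology: apply `op` (np.max or np.min) over the
    structuring-element neighbourhood of every pixel."""
    h, w = se.shape
    pad = np.pad(img, ((h // 2, h // 2), (w // 2, w // 2)), mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            region = pad[i:i + h, j:j + w]
            out[i, j] = op(region[se > 0])
    return out

def gray_dilation(img, se):
    """Union -> maximum, as described in the abstract."""
    return _morph(img, se, np.max)

def gray_erosion(img, se):
    """Intersection -> minimum, as described in the abstract."""
    return _morph(img, se, np.min)

# usage: se = np.ones((3, 3), dtype=int); opened = gray_dilation(gray_erosion(f, se), se)
```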
1711 Bioceramic Scaffolds Fabrication by Rapid Prototyping Technology
Authors: F.H. Liu, S.H. Chen, R.T. Lee, W.S. Lin, Y.S. Liao
Abstract:
This paper describes a rapid prototyping (RP) technology for forming a hydroxyapatite (HA) bone scaffold model. The HA powder and a silica sol are mixed into a bioceramic slurry of suitable viscosity. The HA particles are embedded in the solidified silica matrix to form green parts over a wide range of process parameters after processing by selective laser sintering (SLS). The results indicate that the proposed process makes it possible to fabricate multilayer and hollow shell structures that are brittle but have sufficient integrity for handling prior to post-processing. The fabricated bone scaffold models had a surface finish of 25
Keywords: bioceramic, bone scaffold, rapid prototyping, selective laser sintering
1710 The Effects of Weather Anomalies on the Quantitative and Qualitative Parameters of Maize Hybrids of Different Genetic Traits in Hungary
Authors: Zs. J. Becze, Á. Krivián, M. Sárvári
Abstract:
Hybrid selection and the application of hybrid-specific production technologies are important for increasing the yield and crop safety of maize. The main reason for this is climate change, since weather extremes are ongoing and seem to be accelerating in Hungary as well.
The biological bases, i.e. the selection of appropriate hybrids, will be of greater importance in the future, and the adaptability of hybrids will be considerably more appreciated. Good agronomic traits and stress tolerance against climatic factors and agrotechnical elements (e.g. different types of herbicides) will be important. There have been examples of 3-4 consecutive droughty years in the past decades, e.g. 1992-1993-1994 or 2009-2011-2012, which made the results of crop production critical. Irrigation cannot solve the problem, since currently only 2% of the arable land is irrigated. Temperatures exceeding the multi-year average are characteristic mainly of July and August in Hungary, which significantly increases soil surface evaporation and thus further aggravates water shortage. In terms of the yield and crop safety of maize, the weather of these two months is crucial, since extremely high temperature in July decreases the viability of the pollen and the pistil of maize, decreases the extent of fertilization and delays grain filling. Consequently, yield and crop safety decrease.
Keywords: Abiotic factors, drought, nutrition content, yield.
1709 Determination and Comparison of Fabric Pills Distribution Using Image Processing and Spatial Data Analysis Tools
Authors: Lenka Techniková, Maroš Tunák, Jiří Janáček
Abstract:
This work deals with the determination and comparison of pill patterns in two sets of fabric samples which differ in the way the pills were created. The first set contains fabric samples with pills created by simulation on a Martindale abrasion machine, while the pills in the second set originated during normal wear and maintenance. The goal of the study is to determine whether the pattern of fabric pills created by simulation is the same as the pattern of naturally occurring pills. The system for determination and comparison of the pills is based on image processing and spatial data analysis tools. Firstly, 3D reconstruction of the fabric surfaces with the pills is realized using a gradient fields method, which creates a 3D fabric surface from a set of four images. Thereafter, the pills are detected in the 3D fabric surfaces using image-processing tools in the MATLAB software. Determination and comparison of the pill patterns of the two sets of fabric samples is based on spatial data analysis using tools in the R software.
Keywords: 3D reconstruction of the surface, image analysis tools, distribution of the pills, spatial data analysis tools.
1708 Wavelet Transform and Support Vector Machine Approach for Fault Location in Power Transmission Line
Authors: V. Malathi, N.S.Marimuthu
Abstract:
This paper presents a wavelet transform and Support Vector Machine (SVM) based algorithm for estimating fault location on transmission lines. The discrete wavelet transform (DWT) is used for data pre-processing, and these data are used for training and testing the SVM. Five types of mother wavelet are used for signal processing to identify the wavelet family most appropriate for use in estimating fault location. The results demonstrate the ability of the SVM to generalize the situation from the provided patterns and to accurately estimate the location of faults with varying fault resistance.
Keywords: Fault location, support vector machine, support vector regression, transmission lines, wavelet transform.
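The DWT-then-SVM pipeline could be prototyped as in the Python sketch below: sub-band energies of a fault transient are used as the input vector to a support vector regressor. The mother wavelet, decomposition level and SVR hyper-parameters are assumptions, not the paper's choices.

```python
import numpy as np
import pywt
from sklearn.svm import SVR

def dwt_features(signal, wavelet="db4", level=4):
    """Energy of each DWT sub-band of a recorded fault transient, used as the
    feature vector for the regressor (wavelet and level are assumptions)."""
    signal = np.asarray(signal, dtype=float)
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

# usage: train an SVR to map sub-band energies to fault distance (km)
# X = np.array([dwt_features(s) for s in recorded_currents])
# model = SVR(kernel="rbf", C=100.0).fit(X, fault_distances_km)
```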
1707 Early Diagnosis of Alzheimer's Disease Using a Combination of Images Processing and Brain Signals
Authors: E. Irankhah, M. Zarif, E. Mazrooei Rad, K. Ghandehari
Abstract:
Alzheimer's prevalence is on the rise, and the disease comes with problems like cessation of treatment, high cost of treatment, and the lack of early detection methods. The pathology of this disease causes the formation of protein deposits in the brain of patients, called amyloid plaques. Generally, the diagnosis of this disease is made by performing tests such as cerebrospinal and spinal cord fluid testing, CT scans and MRI, or mental tests and eye-tracing tests. In this paper, we tried to use the Medial Temporal Atrophy (MTA) method and the Leave One Out (LOO) cycle to extract the statistical properties of the three channels Fz, Pz, and Cz of ERP signals for the early diagnosis of this disease. In processing the CT scan images, the accuracy of the results is 81% for the healthy person and 88% for the severe patient. After processing the ERP signals, the accuracy for a healthy person is 81% in the delta band of the Cz channel and 90% in the alpha band of the Pz channel. For the severe patient, the results were 89% in the delta band of the Cz channel and 92% in the alpha band of the Pz channel.
Keywords: Alzheimer's disease, image and signal processing, medial temporal atrophy, LOO Cycle.
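A minimal Python sketch of one plausible way to obtain per-band, per-channel ERP features of the kind discussed above (delta and alpha power on Fz, Pz and Cz) is shown below. The band edges, Welch parameters and rectangle-sum integration are assumptions, not the study's exact pipeline.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, band):
    """Average power of one ERP channel inside a frequency band (Welch PSD)."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), int(2 * fs)))
    lo, hi = band
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])   # approximate band integral

# usage (band edges are common conventions, not the paper's definition):
# bands = {"delta": (0.5, 4.0), "alpha": (8.0, 13.0)}
# features = {(ch, b): band_power(eeg[ch], fs, edges)
#             for ch in ("Fz", "Pz", "Cz") for b, edges in bands.items()}
```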
1706 Cognitive STAP for Airborne Radar Based on Slow-Time Coding
Authors: Fanqiang Kong, Jindong Zhang, Daiyin Zhu
Abstract:
Space-time adaptive processing (STAP) techniques have emerged as a key enabling technology for advanced airborne radar applications. In this paper, the notion of cognitive radar is extended to the STAP technique, and cognitive STAP is discussed. The principle for improving the signal-to-clutter-plus-noise ratio (SCNR) based on slow-time coding is given, and the corresponding optimization algorithm, based on cyclic and power-like algorithms, is presented. Numerical examples show the effectiveness of the proposed method.
Keywords: Space-time adaptive processing (STAP), signal-to-clutter ratio, slow-time coding.
1705 Non-destructive Watermelon Ripeness Determination Using Image Processing and Artificial Neural Network (ANN)
Authors: Shah Rizam M. S. B., Farah Yasmin A.R., Ahmad Ihsan M. Y., Shazana K.
Abstract:
Agricultural products are in ever greater demand in today's market. To increase productivity, automation in producing these products will be very helpful. The purpose of this work is to measure and determine the ripeness and quality of watermelon. The textures on the watermelon skin are captured using a digital camera, and these images are filtered using image processing techniques. All the gathered information is used to train an ANN to determine watermelon ripeness. Initial results showed that the best model produced a percentage accuracy of 86.51%, measured at 32 hidden units with a balanced percentage rate of the training dataset.
Keywords: Artificial Neural Network (ANN), Digital Image Processing, YCbCr Colour Space, Watermelon Ripeness.
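As a hedged sketch of the pipeline described above (YCbCr colour features of the skin image fed to an ANN), the Python snippet below computes simple plane statistics and trains a small multilayer perceptron with 32 hidden units, as reported in the abstract. The feature choice and training settings are assumptions, not the authors' exact method.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def ycbcr_features(rgb):
    """Mean and standard deviation of the Y, Cb and Cr planes of a skin image,
    a simple stand-in for the texture features used in the paper."""
    rgb = rgb.astype(float)
    y  = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    cb = 128 - 0.168736 * rgb[..., 0] - 0.331264 * rgb[..., 1] + 0.5 * rgb[..., 2]
    cr = 128 + 0.5 * rgb[..., 0] - 0.418688 * rgb[..., 1] - 0.081312 * rgb[..., 2]
    return np.array([p.mean() for p in (y, cb, cr)] +
                    [p.std() for p in (y, cb, cr)])

# usage with 32 hidden units, as in the abstract (hyperparameters assumed):
# X = np.array([ycbcr_features(img) for img in images]); labels = np.array(ripeness)
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, labels)
```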
1704 Development of a Thrust Measurement System
Authors: S. Jeon, J. Kim, H. Choi
Abstract:
KSLV-I (Korea Space Launch Vehicle-I) is designed as a launch vehicle to insert a 100 kg-class satellite into LEO (Low Earth Orbit). The attitude angles of the upper stage, including roll, pitch and yaw, are controlled by a cold gas thruster system using nitrogen gas. The cold gas thruster is an actuator in the RCS (Reaction Control System). To design an attitude controller for the upper stage, thrust measurement in vacuum conditions is required. In this paper, the new thrust measurement system and calibration mechanism are developed, and measurement errors and the signal processing method are presented.
Keywords: cold gas thruster, launch vehicle, thrust measurement, calibration mechanism, signal processing
1703 Topological Queries on Graph-structured XML Data: Models and Implementations
Authors: Hongzhi Wang, Jianzhong Li, Jizhou Luo
Abstract:
In many applications, data has a graph structure, which can be naturally represented as graph-structured XML. Existing queries defined on tree-structured and graph-structured XML data mainly focus on subgraph matching, which cannot cover all the requirements of querying on graphs. In this paper, a new kind of query, the topological query on graph-structured XML, is presented. This kind of query considers not only the structure of subgraphs but also the topological relationship between subgraphs. Building on existing subgraph query processing algorithms, efficient algorithms for topological query processing are designed. Experimental results show the efficiency of the implemented algorithms.
Keywords: XML, Graph Structure, Topological query.
1702 Continuous FAQ Updating for Service Incident Ticket Resolution
Authors: Kohtaroh Miyamoto
Abstract:
As enterprise computing becomes more and more complex, the costs and technical challenges of IT system maintenance and support are increasing rapidly. One popular approach to managing IT system maintenance is to prepare and use a FAQ (Frequently Asked Questions) system to manage and reuse systems knowledge. Such a FAQ system can help reduce the resolution time for each service incident ticket. However, there is a major problem where over time the knowledge in such FAQs tends to become outdated. Much of the knowledge captured in the FAQ requires periodic updates in response to new insights or new trends in the problems addressed in order to maintain its usefulness for problem resolution. These updates require a systematic approach to define the exact portion of the FAQ and its content. Therefore, we are working on a novel method to hierarchically structure the FAQ and automate the updates of its structure and content. We use structured information and the unstructured text information with the timelines of the information in the service incident tickets. We cluster the tickets by structured category information, by keywords, and by keyword modifiers for the unstructured text information. We also calculate an urgency score based on trends, resolution times, and priorities. We carefully studied the tickets of one of our projects over a 2.5-year time period. After the first 6 months we started to create FAQs and confirmed they improved the resolution times. We continued observing over the next 2 years to assess the ongoing effectiveness of our method for the automatic FAQ updates. We improved the ratio of tickets covered by the FAQ from 32.3% to 68.9% during this time. Also, the average time reduction of ticket resolution was between 31.6% and 43.9%. Subjective analysis showed more than 75% reported that the FAQ system was useful in reducing ticket resolution times.
Keywords: FAQ System, Resolution Time, Service Incident Tickets, IT System Maintenance.
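The ticket-clustering and urgency-scoring steps described above could be prototyped along the following lines in Python. This is a simplified sketch under assumed choices (TF-IDF with k-means, a linear urgency combination), not the paper's hierarchical method or exact formula.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

def cluster_tickets(texts, n_clusters=20):
    """Group incident tickets by keywords with TF-IDF + k-means, a simplified
    stand-in for the hierarchical clustering used to structure the FAQ."""
    vec = TfidfVectorizer(stop_words="english", max_features=5000)
    X = vec.fit_transform(texts)
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

def urgency(trend, mean_resolution_hours, priority, weights=(0.5, 0.3, 0.2)):
    """Toy urgency score combining ticket-volume trend, resolution time and
    priority; the weights are assumptions, not the paper's formula."""
    w1, w2, w3 = weights
    return w1 * trend + w2 * mean_resolution_hours + w3 * priority
```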
1701 Reverse Engineering of Agricultural Machinery: A Key to Food Sufficiency in Nigeria
Authors: Williams S. Ebhota, Virginia Chika Ebhota, Samuel A. Ilupeju
Abstract:
Agriculture employs about three-quarters of Nigeria's workforce, and yet food sufficiency is a challenge in the country. This is largely due to poor and outdated pre-harvest and post-harvest farming practices. The land fallow system is still being practised, as fertiliser production in the country is grossly inadequate and expensive. The few available post-harvest processing facilities are ageing and inefficient. Also, the use of modern processing equipment is limited by farmers' lack of funds and of adequate capacity to operate and maintain modern farming equipment. This paper, therefore, examines key barriers to agricultural product processing equipment in the country. These barriers include over-dependence on foreign technologies and expertise; poor and inadequate manufacturing infrastructure; lack of political will by political leaders; lack of funds; and lack of adequate technical skills. This paper, however, sees the increase in domestic manufacturing of pre-harvest and post-harvest machinery and equipment through a reverse engineering approach as a key to food production sufficiency in Nigeria.
Keywords: Agricultural machinery, domestic manufacturing, forward engineering, production reverse engineering, technology.
1700 Performance Analysis of Chrominance Red and Chrominance Blue in JPEG
Authors: Mamta Garg
Abstract:
While compressing text files is useful, compressing still image files is almost a necessity. A typical image takes up much more storage than a typical text message, and without compression images would be extremely clumsy to store and distribute. The amount of information required to store pictures on modern computers is quite large in relation to the amount of bandwidth commonly available to transmit them over the Internet and to applications. Image compression addresses the problem of reducing the amount of data required to represent a digital image. The performance of any image compression method can be evaluated by measuring the root-mean-square error and the peak signal-to-noise ratio. The method of image compression analyzed in this paper is based on the lossy JPEG image compression technique, the most popular compression technique for color images. JPEG compression is able to greatly reduce file size with minimal image degradation by throwing away the least "important" information. In JPEG, both color components are downsampled simultaneously, but in this paper we compare the results when the compression is done by downsampling a single chroma component. We demonstrate that a higher compression ratio is achieved when the chrominance blue is downsampled than when the chrominance red is downsampled, but the peak signal-to-noise ratio is higher when the chrominance red is downsampled than when the chrominance blue is downsampled. In particular, we use the hats.jpg image as a demonstration of JPEG compression using a low-pass filter and show that the image is compressed with barely any visual difference with either method.
Keywords: JPEG, Discrete Cosine Transform, Quantization, Color Space Conversion, Image Compression, Peak Signal to Noise Ratio, Compression Ratio.
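The comparison above rests on two simple ingredients: subsampling exactly one chroma plane and measuring PSNR against the original. The Python sketch below shows both in isolation; it omits the DCT, quantization and entropy coding of a real JPEG pipeline, and the 2:1 subsampling factor is an assumption.

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio between two images of equal shape."""
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

def downsample_one_chroma(ycbcr, which="cb"):
    """Subsample only one chroma plane by 2 in each direction and upsample it
    back, leaving the other plane untouched (illustrates the comparison only;
    real JPEG would follow with DCT, quantization and entropy coding)."""
    out = ycbcr.astype(float).copy()
    idx = 1 if which == "cb" else 2                   # plane 1 = Cb, plane 2 = Cr
    small = out[::2, ::2, idx]
    up = np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)
    out[:, :, idx] = up[:out.shape[0], :out.shape[1]]
    return out
```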
1699 Comparison of Processing Conditions for Plasticized PVC and PVB
Authors: Michael Tupý, Jaroslav Císař, Pavel Mokrejš, Dagmar Měřínská, Alice Tesaříková-Svobodová
Abstract:
A worldwide problem is that waste PVB is not recycled and is widely stored in landfills. However, PVB has chemical properties similar to those of PVC, and both of these polymers are plasticized. Therefore, a study of the thermal properties of plasticized PVC and of recycled PVB obtained by recycling windshields is carried out. This work has been done in order to find non-degrading processing conditions applicable to both polymers. The tested PVC contained 38% of the plasticizer diisononyl phthalate (DINP), and the PVB was plasticized with 28% of triethylene glycol bis(2-ethylhexanoate) (3GO). The thermal and thermo-oxidative decomposition of both vinyl polymers are compared by calorimetric analysis and by tensile strength analysis.
Keywords: Poly(vinyl chloride), Poly(vinyl butyral), Recycling, Reprocessing, Thermal analysis, Decomposition.
1698 Efficient Method for ECG Compression Using Two Dimensional Multiwavelet Transform
Authors: Morteza Moazami-Goudarzi, Mohammad H. Moradi, Ali Taheri
Abstract:
In this paper we introduce an effective ECG compression algorithm based on the two-dimensional multiwavelet transform. Multiwavelets offer simultaneous orthogonality, symmetry and short support, which is not possible with scalar two-channel wavelet systems. These features are known to be important in signal processing, so multiwavelets offer the possibility of superior performance for image processing applications. The SPIHT algorithm has achieved notable success in still image coding. We suggest applying the SPIHT algorithm to the 2-D multiwavelet transform of 2-D arranged ECG signals. Experiments on selected records of ECG from the MIT-BIH arrhythmia database revealed that the proposed algorithm is significantly more efficient in comparison with previously proposed ECG compression schemes.
Keywords: ECG signal compression, multi-rate processing, 2-D Multiwavelet, Prefiltering.
1697 Analysis of Control by Flattening of the Welded Tubes
Authors: Hannachi Med Tahar, H. Djebaili, B. Daheche
Abstract:
In this work, we describe the flattening of welded tubes and its experimental application. The test is carried out at the National product processing company (dishes and tubes production). Usually, the final products (tubes) undergo a series of non-destructive inspections of the weld, online and offline, and obviously destructive mechanical tests (bending, flattening, flaring, etc.). For this reason, and in order to implement the flattening test, which applies to the forming of round tubes into other shapes, four sections of draft welded tubes (before hot stretching) and of finished welded tubes (after hot drawing and annealing) were taken. It was also noted that the 'health' report of the flattened tubes must show neither cracks nor tears. The test is considered failed if it reveals a lack of ductility of the metal.
Keywords: Flattening, destructive testing, tube drafts, finished tube, Castem 2001.
1696 Author Profiling: Prediction of Learners’ Gender on a MOOC Platform Based on Learners’ Comments
Authors: Tahani Aljohani, Jialin Yu, Alexandra. I. Cristea
Abstract:
The more an educational system knows about a learner, the more personalised interaction it can provide, which leads to better learning. However, asking a learner directly is potentially disruptive, and often ignored by learners. Especially in the booming realm of MOOCs (Massive Open Online Course platforms), only a very low percentage of users disclose demographic information about themselves. Thus, in this paper, we aim to predict learners’ demographic characteristics by proposing an approach using linguistically motivated deep learning architectures for learner profiling, particularly targeting gender prediction on a FutureLearn MOOC platform. Additionally, we tackle the difficult problem of predicting the gender of learners based on their comments only, which are often available across MOOCs. The most common current approaches to text classification use the Long Short-Term Memory (LSTM) model, considering sentences as sequences. However, human language also has structure. In this research, rather than considering sentences as plain sequences, we hypothesise that higher semantic- and syntactic-level sentence processing based on linguistics will render a richer representation. We thus evaluate the traditional LSTM against other bleeding-edge models that take syntactic structure into account, such as the tree-structured LSTM, the Stack-augmented Parser-Interpreter Neural Network (SPINN) and the Structure-Aware Tag Augmented model (SATA). Additionally, we explore using different word-level encoding functions. We have implemented these methods on our MOOC dataset, which yields the best performance, and on a public sentiment analysis dataset that is further used to cross-examine the models' results.
Keywords: Deep learning, data mining, gender predication, MOOCs.
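For orientation, a minimal PyTorch sketch of the plain sequential LSTM baseline that the paper compares against is given below. All hyper-parameters are assumptions, and the tree-structured LSTM, SPINN and SATA variants that the paper actually evaluates are not shown.

```python
import torch
import torch.nn as nn

class CommentGenderLSTM(nn.Module):
    """Minimal sequence classifier in the spirit of the LSTM baseline:
    embed tokens, run an LSTM, classify from the final hidden state."""
    def __init__(self, vocab_size, embed_dim=100, hidden=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        emb = self.embed(token_ids)
        _, (h_n, _) = self.lstm(emb)       # final hidden state summarises the comment
        return self.out(h_n[-1])           # logits over gender classes

# training would minimise nn.CrossEntropyLoss over (logits, labels)
```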
1695 A Novel Web Metric for the Evaluation of Internet Trends
Authors: Radek Malinský, Ivan Jelínek
Abstract:
Web 2.0 (social networking, blogging and online forums) can serve as a data source for social science research because it contains a vast amount of information from many different users. The volume of that information has been growing at a very high rate, becoming a network of heterogeneous data; this makes information difficult to find and therefore almost useless. We have proposed a novel theoretical model for gathering and processing data from Web 2.0 which would reflect the semantic content of web pages in a better way. This article deals with the analysis part of the model and its usage for content analysis of blogs. The introductory part of the article describes the methodology for gathering and processing data from blogs. The next part of the article is focused on the evaluation and content analysis of blogs that write about a specific trend.
Keywords: Blog, Sentiment Analysis, Web 2.0, Webometrics