Search results for: and processing time.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7604

7004 A Novel Application of Network Equivalencing Method in Time Domain to Precise Calculation of Dead Time in Power Transmission

Authors: J. Moshtagh, L. Eslami

Abstract:

Various studies have shown that about 90% of single-line-to-ground faults on high-voltage transmission lines are transient in nature. Faults of this type are cleared by a temporary outage of the faulted phase (single-phase auto-reclosure). The interval between the opening and reclosing of the faulted-phase circuit breakers is called the “dead time” and typically lasts several hundred milliseconds. To adjust traditional single-phase auto-reclosures, which are usually not intelligent, the dead time must be calculated precisely off-line. If the dead time used to adjust the single-phase auto-reclosure is shorter than the actual dead time, reclosing the circuit breakers seriously threatens the power system. This paper therefore presents a novel approach to the precise calculation of dead time in power transmission lines based on network equivalencing in the time domain. The approach is considerably more precise than the traditional method based on the Thevenin equivalent circuit. To compare the proposed approach with the traditional method, a comprehensive simulation is performed with EMTP-ATP on an extensive power network.

Keywords: Dead Time, Network Equivalencing, High Voltage Transmission Lines, Single Phase Auto-Reclosure.

7003 Design of Real Time Early Response Systems for Natural Disaster Management Based On Automation and Control Technologies

Authors: C. Pacheco, A. Cipriano

Abstract:

A new concept of response system is proposed to fill the gap that exists in reducing vulnerability during the immediate response to natural disasters. Real Time Early Response Systems (RTERSs) incorporate real-time information as feedback data for closing the control loop and for generating a real-time situation assessment. A review of the state of the art on works that fit the concept of RTERS is presented, and it is found that they are mainly focused on man-made disasters. At the same time, in the response phase of natural disaster management many works are concerned with creating early warning systems, but few efforts have been devoted to deciding what to do once an alarm is activated. In this context, an RTERS arises as a useful tool for supporting people in their decision-making process during natural disasters after an event is detected, and also as an innovative context for applying well-known automation technologies and automatic control concepts and tools.

Keywords: Disaster management, emergency response system, natural disasters, real time.

7002 A Comparison of Software Analysis and Design Methods for Real Time Systems

Authors: Anthony Spiteri Staines

Abstract:

This paper examines and compares several of the most common real-time methods: CORE, YSM, MASCOT, JSD, DARTS, RTSAD, ADARTS, CODARTS, HOOD, HRT-HOOD, ROOM, UML and UML-RT. The methods are compared using attributes such as i) usability, ii) compositionality and iii) availability of proper real-time notations. Finally, some comparison results are given and discussed.

Keywords: Software Engineering Methods, Method Comparison, Real Time Analysis and Design.

7001 Review of Surface Electromyogram Signals: Its Analysis and Applications

Authors: Anjana Goen, D. C. Tiwari

Abstract:

Electromyography (EMG) is the study of muscle function through analysis of the electrical activity produced by muscles. This electrical activity, which is displayed in the form of a signal, is the result of neuromuscular activation associated with muscle contraction. The most common EMG signal recording techniques use surface and needle/wire electrodes, where the latter are usually used when deep muscles are of interest. This paper focuses on the surface electromyogram (SEMG) signal. During SEMG recording, several problems have to be countered, such as noise, motion artifacts and signal instability. Thus, various signal processing techniques have been implemented to produce a reliable signal for analysis. The SEMG signal finds broad application, particularly in the biomedical field. It has been analyzed and studied for various interests such as neuromuscular disease, enhancement of muscular function and human-computer interfaces.

Keywords: Evolvable hardware (EHW), Functional Electrical Stimulation (FES), Hidden Markov Model (HMM), Hjorth Time Domain (HTD).

7000 64 bit Computer Architectures for Space Applications – A study

Authors: Niveditha Domse, Kris Kumar, K. N. Balasubramanya Murthy

Abstract:

More recent satellite projects/programs make extensive use of real-time embedded systems. 16-bit processors that meet the Mil-Std-1750 standard architecture have been used in on-board systems. Most space applications have been written in Ada. From a futuristic point of view, 32-bit/64-bit processors are needed in the area of spacecraft computing, and therefore an effort is desirable in the study and survey of 64-bit architectures for space applications. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada). There are several basic requirements for a special processor for this purpose. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system, and manufacturability of such processors. Further considerations include selection of FPGA devices, selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc. This project deals with a brief study of 32- and 64-bit processors readily available in the market and designing/fabricating a 64-bit RISC processor named RISC MicroProcessor with the added functionalities of an extended double-precision floating point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using Open Core (OpenSparc T1 Verilog RTL) and Open Source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE for synthesis are also used when appropriate.

Keywords: RISC MicroProcessor, RPC – RISC Processor Core, PBX – Processor to Block Interface part of the Interconnection Network, BPX – Block to Processor Interface part of the Interconnection Network, FPU – Floating Point Unit, SPU – Signal Processing Unit, WB – Wishbone Interface, CTU – Clock and Test Unit

6999 A Survey on Lossless Compression of Bayer Color Filter Array Images

Authors: Alina Trifan, António J. R. Neves

Abstract:

Although most digital cameras acquire images in a raw format, based on a Color Filter Array (CFA) that arranges RGB color filters on a square grid of photosensors, most image compression techniques do not use the raw data; instead, they use the RGB result of an interpolation algorithm applied to the raw data. This approach is inefficient: by performing lossless compression of the raw data, followed by pixel interpolation, digital cameras could be more power efficient and provide images with increased resolution, given that the interpolation step could be shifted to an external processing unit. In this paper, we conduct a survey on the use of lossless compression algorithms with raw Bayer images. Moreover, in order to reduce the effect of the transitions between colors that increase the entropy of the raw Bayer image, we split the image into three new images corresponding to each channel (red, green and blue) and study the same compression algorithms applied to each one individually. This simple pre-processing stage allows an improvement of more than 15% in predictive-based methods.

Keywords: Bayer images, CFA, losseless compression, image coding standards.
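A rough illustration of the channel-splitting pre-processing stage described in the abstract: the sketch below separates a raw Bayer mosaic into its R, G and B sample planes before handing each plane to a lossless coder. The RGGB layout, the synthetic data and the use of plain zlib as a stand-in for the predictive codecs surveyed in the paper are assumptions for illustration only.

```python
import zlib
import numpy as np

def split_bayer_rggb(mosaic):
    """Split a single-channel Bayer mosaic (assumed RGGB) into R, G, B planes."""
    r = mosaic[0::2, 0::2]                      # red samples
    g = np.concatenate((mosaic[0::2, 1::2],     # green samples (two phases)
                        mosaic[1::2, 0::2]), axis=0)
    b = mosaic[1::2, 1::2]                      # blue samples
    return r, g, b

def compressed_size(plane):
    """Losslessly compress one plane; zlib is only a stand-in codec here."""
    return len(zlib.compress(plane.astype(np.uint8).tobytes(), level=9))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic frame; real raw sensor data (not random noise) is what shows the gain.
    mosaic = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)
    whole = compressed_size(mosaic)
    split = sum(compressed_size(p) for p in split_bayer_rggb(mosaic))
    print(f"whole mosaic: {whole} bytes, per-channel total: {split} bytes")
```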

6998 Simultaneous Saccharification and Fermentation (SSF) of Sugarcane Bagasse - Kinetics and Modeling

Authors: E. Sasikumar, T. Viruthagiri

Abstract:

Simultaneous Saccharification and Fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC *1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using Response Surface Methodology (RSM) based on central composite design (CCD) experiments. A 2³ five-level CCD with central and axial points was used to develop a statistical model for the optimization of process variables such as incubation temperature (25–45°C) X1, pH (5.0–7.0) X2 and fermentation time (24–120 h) X3. Data obtained from RSM on ethanol production were subjected to analysis of variance (ANOVA) and analyzed using a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out using an online-monitored modular fermenter of 2 L capacity. The processing parameter setup reaching the maximum response for ethanol production was obtained when applying the optimum values for temperature (32°C), pH (5.6) and fermentation time (110 h). The maximum ethanol concentration (3.36 g/l) was obtained from 50 g/l pretreated sugarcane bagasse at the optimized process conditions in aerobic batch fermentation. Kinetic models such as the Monod, modified logistic, modified logistic incorporated Leudeking–Piret and modified logistic incorporated modified Leudeking–Piret models were evaluated and their constants were predicted.

Keywords: Sugarcane bagasse, ethanol, optimization, Pachysolen tannophilus.
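The second-order polynomial used in RSM analysis of this kind can be fitted by ordinary least squares. The sketch below does this for three coded factors (temperature, pH, time); the CCD-like points and response values are synthetic placeholders, since the paper's data are not reproduced here.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Columns: intercept, linear, pure quadratic and two-factor interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]                                # linear
    cols += [X[:, i] ** 2 for i in range(k)]                           # quadratic
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]  # interactions
    return np.column_stack(cols)

# Placeholder coded factor settings (x1 = temperature, x2 = pH, x3 = time) and yields.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 3))
y = 3.0 - 0.5 * X[:, 0] ** 2 - 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 2] + rng.normal(0, 0.05, 20)

A = quadratic_design_matrix(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted second-order coefficients:", np.round(coef, 3))
```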

6997 Optimum Time Coordination of Overcurrent Relays using Two Phase Simplex Method

Authors: Prashant P. Bedekar, Sudhir R. Bhide, Vijay S. Kale

Abstract:

Overcurrent (OC) relays are the major protection devices in a distribution system. The operating times of the OC relays have to be coordinated properly to avoid mal-operation of the backup relays. OC relay time coordination in ring-fed distribution networks is a highly constrained optimization problem that can be stated as a linear programming problem (LPP). The purpose is to find an optimum relay setting that minimizes the operating times of the relays and, at the same time, keeps the relays properly coordinated to avoid mal-operation. This paper presents a two-phase simplex method for optimum time coordination of OC relays. The method is based on the simplex algorithm, which is used to find the optimum solution of the LPP. The method introduces artificial variables to get an initial basic feasible solution (IBFS). The artificial variables are removed in the iterative process of the first phase, which minimizes the auxiliary objective function. The second phase minimizes the original objective function and gives the optimum time coordination of the OC relays.

Keywords: Constrained optimization, LPP, Overcurrent relay coordination, Two-phase simplex method.
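The coordination problem can be written as a small LPP. The toy instance below (one primary/backup relay pair, illustrative operating-time limits and a 0.3 s coordination margin, all assumed values) is solved with scipy's LP solver rather than the hand-coded two-phase simplex described in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: t = [t_primary, t_backup], operating times in seconds.
c = np.array([1.0, 1.0])          # minimise the sum of operating times

# Coordination constraint: t_backup - t_primary >= CTI  (CTI = 0.3 s, assumed).
# linprog uses A_ub @ t <= b_ub, so rewrite as t_primary - t_backup <= -CTI.
A_ub = np.array([[1.0, -1.0]])
b_ub = np.array([-0.3])

# Illustrative bounds on each relay's operating time.
bounds = [(0.1, 1.0), (0.1, 1.5)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("optimal operating times:", res.x)   # expected: [0.1, 0.4]
```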

6996 Effect of Processing on Sensory Characteristics and Chemical Composition of Cottonseed (Gossypium hirsutum) and Its Extract

Authors: Olufunke O. Ezekiel, Abiodun A. Oriku

Abstract:

The seeds of cotton (Gossypium hirsutum) fall among the lesser-known oil seeds. Cottonseeds are not normally consumed in their natural state due to their content of gossypol, an antinutrient. The effect of processing on the sensory characteristics and chemical composition of cottonseed and its extract was studied by subjecting the cottonseed extract to heat treatment (boiling) and the cottonseed to fermentation. The cottonseed extract was boiled using the open pot and the pressure pot for 30 minutes, respectively. The fermentation of the cottonseed was carried out for 6 days, with samples withdrawn at intervals of 2 days. The extract and fermented samples were subjected to chemical analysis and sensory evaluation for colour, aroma, taste, mouth feel, appearance and overall acceptability. The open pot sample was more preferred. Fermentation for 6 days resulted in a significant reduction in the gossypol level of the cottonseed; however, the sample fermented for 2 days was most preferred.

Keywords: Cottonseed, boiling, extract, fermentation, True protein.

6995 Hardware Prototyping of an Efficient Encryption Engine

Authors: Muhammad I. Ibrahimy, Mamun B.I. Reaz, Khandaker Asaduzzaman, Sazzad Hussain

Abstract:

An approach to develop an FPGA implementation of a flexible-key RSA encryption engine that can be used as a standard device in secured communication systems is presented. The VHDL model of this RSA encryption engine has the unique characteristic of supporting multiple key sizes, and thus can easily fit into systems that require different levels of security. Simple nested-loop addition and subtraction have been used to implement the RSA operation. This has made the processing time faster and used a comparatively smaller amount of space in the FPGA. The hardware design is targeted at the Altera Stratix II family, and it was determined that the flexible-key RSA encryption engine is best suited to the device EP2S30F484C3. The RSA encryption implementation uses 13,779 logic elements and achieves a clock frequency of 17.77 MHz. It has been verified that this RSA encryption engine can perform 32-bit, 256-bit and 1024-bit encryption operations in less than 41.585 µs, 531.515 µs and 790.61 µs, respectively.

Keywords: RSA, FPGA, Communication, Security, VHDL.
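A software sketch of the arithmetic the engine implements in hardware: modular multiplication done with repeated (nested-loop style) addition and subtraction, used inside square-and-multiply exponentiation. The tiny textbook key below is an assumption for illustration only, not the paper's configuration.

```python
def mod_mul_add_sub(a, b, n):
    """Modular multiplication using only addition and subtraction (shift-and-add),
    mirroring the nested-loop adder/subtractor structure described in the abstract."""
    result = 0
    a %= n
    while b:
        if b & 1:
            result += a
            if result >= n:
                result -= n
        a += a                      # a = 2a via addition
        if a >= n:
            a -= n
        b >>= 1
    return result

def mod_exp(base, exp, n):
    """Square-and-multiply exponentiation built on the add/sub multiplier."""
    result = 1 % n
    base %= n
    while exp:
        if exp & 1:
            result = mod_mul_add_sub(result, base, n)
        base = mod_mul_add_sub(base, base, n)
        exp >>= 1
    return result

# Toy RSA key (illustrative only): p = 61, q = 53, n = 3233, e = 17, d = 2753.
n, e, d = 3233, 17, 2753
m = 65
c = mod_exp(m, e, n)       # encrypt
assert mod_exp(c, d, n) == m
print("ciphertext:", c)
```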

6994 Graphene Oxide Fiber with Different Exfoliation Time and Activated Carbon Particle

Authors: Nuray Uçar, Mervin Ölmez, Özge Alptoğa, Nilgün K. Yavuz, Ayşen Önen

Abstract:

In recent years, research on continuous graphene oxide fibers has intensified, and many factors of the production stages are being studied. In this study, the effects of exfoliation time and of the presence of activated carbon particles (ACP) on the properties of graphene oxide fiber have been analyzed. The cross-sectional appearance of the sample with ACP is rough and porous because of the ACP. The addition of ACP did not change the electrical conductivity; however, it results in an enormous decrease in the mechanical properties. A longer exfoliation time results in a higher degree of crystallinity, a higher C/O ratio and less d-spacing between the layers. The breaking strength and electrical conductivity of the sample with the shorter exfoliation time are somewhat higher than those of the sample with the longer exfoliation time.

Keywords: Activated carbon, coagulation by wet spinning, exfoliation, graphene oxide fiber.

6993 Effective Software-Based Solution for Processing Mass Downstream Data in Interactive Push VOD System

Authors: Ni Hong, Wu Guobin, Wu Gang, Pan Liang

Abstract:

The interactive push VOD system is a new kind of system that incorporates push technology and interactive techniques. It can push movies to users at high speed during off-peak hours for optimal network usage, so as to save bandwidth. This paper presents an effective software-based solution for processing mass downstream data at the terminals of an interactive push VOD system, where the service can download a movie according to a viewer's selection. The downstream data are divided into two categories: (1) carousel data delivered according to the DSM-CC protocol; (2) IP data delivered according to the Euro-DOCSIS protocol. In order to accelerate download speed and reduce the data loss rate at the terminals, the software strategy introduces caching, multi-threading and resuming mechanisms. The experiments demonstrate the advantages of the software-based solution.

Keywords: DSM-CC, data carousel, Euro-DOCSIS, push VOD.

6992 Computational Intelligence Hybrid Learning Approach to Time Series Forecasting

Authors: Chunshien Li, Jhao-Wun Hu, Tai-Wei Chiang, Tsunghan Wu

Abstract:

Time series forecasting is an important and widely popular topic in the research of system modeling. This paper describes how to apply the hybrid PSO-RLSE neuro-fuzzy learning approach to the problem of time series forecasting. The PSO algorithm is used to update the premise parameters of the proposed prediction system, and the RLSE is used to update the consequent parameters. Thanks to the hybrid learning (HL) approach for the neuro-fuzzy system, the prediction performance is excellent and the speed of learning convergence is much faster than that of other compared approaches. In the experiments, we use the well-known Mackey-Glass chaotic time series. According to the experimental results, the prediction performance and accuracy in time series forecasting by the proposed approach are much better than those of the other compared approaches, as shown in Table IV of the paper. Excellent prediction performance by the proposed approach has been observed.

Keywords: forecasting, hybrid learning (HL), Neuro-Fuzzy System (NFS), particle swarm optimization (PSO), recursive least-squares estimator (RLSE), time series
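For readers who want to reproduce the benchmark data rather than the PSO-RLSE learner itself, the sketch below generates the well-known Mackey-Glass chaotic series by simple Euler integration of the delay differential equation with the usual parameters (τ = 17, β = 0.2, γ = 0.1, n = 10); the step size and constant initial history are assumptions.

```python
import numpy as np

def mackey_glass(length=2000, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0, x0=1.2):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t)."""
    delay = int(round(tau / dt))
    x = np.empty(length + delay)
    x[: delay + 1] = x0                  # assumed constant initial history
    for t in range(delay, length + delay - 1):
        x_tau = x[t - delay]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x[t])
    return x[delay:]

series = mackey_glass()
print(series[:5])
```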

6991 Contextual SenSe Model: Word Sense Disambiguation Using Sense and Sense Value of Context Surrounding the Target

Authors: Vishal Raj, Noorhan Abbas

Abstract:

Ambiguity in NLP (Natural Language Processing) refers to the ability of a word, phrase, sentence, or text to have multiple meanings. This results in various kinds of ambiguity, such as lexical, syntactic, semantic, anaphoric and referential ambiguity. This study focuses mainly on solving the issue of lexical ambiguity. Word Sense Disambiguation (WSD) is an NLP technique that aims to resolve lexical ambiguity by determining the correct meaning of a word within a given context. Most WSD solutions rely on words for training and testing, but we have used lemma and Part of Speech (POS) tokens of words for training and testing. The lemma adds generality, and the POS adds the word's properties to the token. We have designed a method to create an affinity matrix that gives the affinity between any pair of lemma_POS tokens (a token in which the lemma and POS of a word are joined by an underscore) in a given training set. Additionally, we have devised an algorithm to create sense clusters of tokens using the affinity matrix under the hierarchy of the POS of the lemma. Furthermore, three different mechanisms to predict the sense of the target word using the affinity/similarity value are devised. Each contextual token contributes some value to each sense of the target word, and whichever sense gets the highest value becomes the sense of the target word. Contextual tokens thus play a key role in creating the sense clusters and predicting the sense of the target word; hence, the model is named the Contextual SenSe Model (CSM). CSM is notably simple and easy to explain, in contrast to contemporary deep learning models, which are intricate, time-intensive and hard to interpret. CSM is trained on the SemCor training data and evaluated on the SemEval test dataset. The results indicate that, despite the simplicity of the method, it achieves promising results when compared to the Most Frequent Sense (MFS) model.

Keywords: Word Sense Disambiguation, WSD, Contextual SenSe Model, Most Frequent Sense, part of speech, POS, Natural Language Processing, NLP, OOV, out of vocabulary, ELMo, Embeddings from Language Model, BERT, Bidirectional Encoder Representations from Transformers, Word2Vec, lemma_POS, Algorithm.
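A minimal sketch of the kind of lemma_POS affinity matrix described above, using plain co-occurrence counts within a sentence as the affinity value. The toy corpus, the hand-made sense clusters and the exact affinity definition are assumptions, since the paper's precise formula is not given in the abstract.

```python
from collections import defaultdict
from itertools import combinations

# Toy "lemma_POS" training sentences (already lemmatised and POS-tagged by assumption).
sentences = [
    ["bank_NOUN", "lend_VERB", "money_NOUN"],
    ["river_NOUN", "bank_NOUN", "flood_VERB"],
    ["bank_NOUN", "charge_VERB", "interest_NOUN"],
]

# Affinity between two lemma_POS tokens = number of sentences in which they co-occur.
affinity = defaultdict(int)
for sent in sentences:
    for a, b in combinations(sorted(set(sent)), 2):
        affinity[(a, b)] += 1

def predict_sense(target, context, sense_clusters):
    """Score each candidate sense cluster by its summed affinity with the context tokens."""
    def aff(x, y):
        return affinity[tuple(sorted((x, y)))]
    scores = {sense: sum(aff(member, c) for member in members for c in context)
              for sense, members in sense_clusters.items()}
    return max(scores, key=scores.get), scores

# Assumed, hand-made sense clusters for the ambiguous token "bank_NOUN".
clusters = {"bank.finance": ["money_NOUN", "interest_NOUN"],
            "bank.river": ["river_NOUN", "flood_VERB"]}
print(predict_sense("bank_NOUN", ["lend_VERB", "money_NOUN"], clusters))
```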

6990 Dynamic Web-Based 2D Medical Image Visualization and Processing Software

Authors: Abdelhalim N. Mohammed, Mohammed Y. Esmail

Abstract:

In the course of recent decades, medical imaging was dominated by the use of costly film media for the review and archival of medical investigations; however, due to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web was produced. Web technologies have been used successfully in telemedicine applications, and the combination of web technologies with DICOM was used here to design a web-based, open-source DICOM viewer. The web server allows the query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need for any pre-installed software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP Apache server is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers a few advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, is platform-independent, and allows images to be displayed and manipulated efficiently; it is also user-friendly and easy to integrate with an existing system that already makes use of web technologies. A wavelet-based image compression technique is used, in which the 2-D discrete wavelet transform decomposes the image and the wavelet coefficients are transmitted by entropy encoding after thresholding, to decrease transmission time, storage cost and capacity. The compression performance was estimated using image quality metrics such as the mean square error (MSE), peak signal-to-noise ratio (PSNR) and compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN.
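The quality metrics quoted above are straightforward to compute. The sketch below shows the usual definitions of MSE and PSNR for an 8-bit image and its reconstruction, plus a compression ratio expressed as percentage space saving; the arrays, byte counts and that CR convention are placeholders and assumptions, not the paper's data.

```python
import numpy as np

def mse(original, reconstructed):
    return float(np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2))

def psnr(original, reconstructed, peak=255.0):
    e = mse(original, reconstructed)
    return float("inf") if e == 0 else 10.0 * np.log10(peak ** 2 / e)

def compression_ratio(original_bytes, compressed_bytes):
    """Reported here as percentage space saving: (1 - compressed/original) * 100 (assumed)."""
    return (1.0 - compressed_bytes / original_bytes) * 100.0

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)        # placeholder image
rec = np.clip(img.astype(int) + rng.integers(-2, 3, img.shape), 0, 255).astype(np.uint8)
print(f"MSE={mse(img, rec):.2f}  PSNR={psnr(img, rec):.2f} dB  "
      f"CR={compression_ratio(img.nbytes, 10_500):.2f}%")
```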

6989 Research on Weakly Hard Real-Time Constraints and Their Boolean Combination to Support Adaptive QoS

Authors: Xiangbin Zhu

Abstract:

Advances in computing applications in recent years have prompted the demand for more flexible scheduling models to support QoS. Moreover, in practical applications, partly violated temporal constraints can be tolerated if the violations follow a certain distribution, so we need to extend the traditional Liu and Layland model to adapt to these circumstances. There are two such extensions: the (m, k)-firm model and the window-constrained model. This paper studies weakly hard real-time constraints and their Boolean combination to support QoS. The fact that a practical application can tolerate some violations of its temporal constraints under a certain distribution is employed to support adaptive QoS on an open real-time system. The experimental results show that these approaches are effective compared to traditional scheduling algorithms.

Keywords: Weakly Hard Real-Time, Real-Time, Scheduling, Quality of Service.
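As a concrete reading of the (m, k)-firm constraint mentioned above, the sketch below checks whether at least m deadlines are met in every window of k consecutive job instances; the example miss patterns are invented for illustration.

```python
from collections import deque

def satisfies_mk_firm(met, m, k):
    """True if every window of k consecutive instances contains at least m deadline hits.

    `met` is a sequence of booleans, one per job instance (True = deadline met).
    """
    window = deque(maxlen=k)
    for hit in met:
        window.append(hit)
        if len(window) == k and sum(window) < m:
            return False
    return True

# Example: tolerate at most 1 miss in any 3 consecutive instances -> (m, k) = (2, 3).
pattern_ok = [True, True, False, True, True, False, True]
pattern_bad = [True, False, False, True, True, True, True]
print(satisfies_mk_firm(pattern_ok, 2, 3))    # True
print(satisfies_mk_firm(pattern_bad, 2, 3))   # False
```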

6988 MJPEG Real-Time Transmission in Industrial Environments Using a CBR Channel

Authors: J. Silvestre, L. Almeida, R. Marau, P. Pedreiras

Abstract:

Currently, there are many local area industrial networks that can give guaranteed bandwidth to synchronous traffic, in particular by providing CBR (Constant Bit Rate) channels, which allow improved bandwidth management. Some such networks operate over Ethernet, delivering channels with enough capacity, especially with compression, to integrate multimedia traffic from many sources in industrial monitoring and image processing applications. In these industrial environments, where low latency is an essential requirement, JPEG is an adequate compression technique, but it generates VBR (Variable Bit Rate) traffic. Transmitting VBR traffic over CBR channels is inefficient, and current solutions to this problem significantly increase the latency or further degrade the quality. In this paper, an R(q) model is used that allows on-line calculation of the JPEG quantization factor. We obtained increased quality and a lower requirement on the CBR channel, with a reduced number of discarded frames and better use of the channel bandwidth.

Keywords: Industrial Networks, Multimedia.

6987 Effect of Smoke Drying Techniques on the Proximate and Mineral Composition of Macrobrachium vollenhovenii (African River Prawn)

Authors: D. E. Omomo, R. M. Sunday, I. Kareem

Abstract:

This study was carried out to evaluate the nutritional composition of the African river prawn (Macrobrachium vollenhovenii) in relation to the Chokor (traditional) and Altona (improved traditional) drying techniques used in the preservation and processing of prawns, by carrying out proximate composition analysis. The values obtained for the proximate analysis of the Chokor and Altona smoke-dried prawns were: moisture (14.90% and 15.15%), dry matter (85.10% and 84.85%), protein (55.80% and 58.87%), crude fat (1.95% and 1.98%), crude fibre (21.40% and 13.11%), carbohydrate (0.54% and 0.54%) and ash (19.76% and 15.86%), respectively. The proximate mineral composition of the Chokor and Altona smoke-dried prawns was: calcium (5.66% and 4.20%) and phosphorus (9.22% and 6.34%), respectively. The results show that there was no loss of nutritional value with respect to the Chokor and Altona drying techniques used in the processing of prawns.

Keywords: Altona, Chokor, Macrobrachium vollenhovenii, Proximate composition, Smoke drying.

6986 Revealing Nonlinear Couplings between Oscillators from Time Series

Authors: B.P. Bezruchko, D.A. Smirnov

Abstract:

Quantitative characterization of nonlinear directional couplings between stochastic oscillators from data is considered. We suggest coupling characteristics readily interpreted from a physical viewpoint and their estimators. An expression for a statistical significance level is derived analytically that allows reliable coupling detection from a relatively short time series. Performance of the technique is demonstrated in numerical experiments.

Keywords: Nonlinear time series analysis, directional couplings, coupled oscillators.

6985 Existence and Stability Analysis of Discrete-time Fuzzy BAM Neural Networks with Delays and Impulses

Authors: Chao Wang, Yongkun Li

Abstract:

In this paper, the discrete-time fuzzy BAM neural network with delays and impulses is studied. Sufficient conditions are obtained for the existence and global stability of a unique equilibrium of this class of fuzzy BAM neural networks with Lipschitzian activation functions without assuming their boundedness, monotonicity or differentiability and subjected to impulsive state displacements at fixed instants of time. Some numerical examples are given to demonstrate the effectiveness of the obtained results.

Keywords: Discrete-time fuzzy BAM neural networks, impulses, global exponential stability, global asymptotical stability, equilibrium point.

6984 Detecting and Measuring Fabric Pills Using Digital Image Analysis

Authors: Dariush Semnani, Hossein Ghayoor

Abstract:

In this paper, a novel method is presented for evaluating fabric pills using digital image processing techniques. This work provides a novel technique for detecting pills and also measuring their heights, surfaces and volumes. Measuring the intensity of defects by human vision is an inaccurate method for quality control; as a result, this problem became a motivation for employing digital image processing techniques for the detection of defects on the fabric surface. In former works, the systems were limited to measuring the surface of defects, but in the presented method the height and the volume of defects are also measured, which leads to more accurate quality control. An algorithm was developed to first find the pills and then measure their average intensity using the three criteria of height, surface and volume. The results showed a meaningful relation between the number of rotations and the quality of pilled fabrics.

Keywords: 3D analysis, computer vision, fabric, pile, surface evaluation
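A rough sketch of the measurement step: threshold the intensity image, label connected regions as candidate pills, and report each region's surface (pixel count), height (peak intensity above the threshold) and volume (summed intensity above the threshold). The synthetic image, the threshold rule and these exact definitions are assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
fabric = rng.normal(100, 5, size=(128, 128))          # synthetic fabric surface
fabric[30:38, 40:48] += 60                            # synthetic pill 1
fabric[80:85, 90:96] += 40                            # synthetic pill 2

threshold = fabric.mean() + 3 * fabric.std()          # assumed threshold
mask = fabric > threshold
labels, n_pills = ndimage.label(mask)                 # connected-component labelling

for pill_id in range(1, n_pills + 1):
    region = labels == pill_id
    excess = fabric[region] - threshold
    surface = int(region.sum())                       # area in pixels
    height = float(excess.max())                      # peak height above threshold
    volume = float(excess.sum())                      # integrated height
    print(f"pill {pill_id}: surface={surface}px  height={height:.1f}  volume={volume:.1f}")
```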

6983 Microwave Pretreatment of Seeds to Extract High Quality Vegetable Oil

Authors: S. Azadmard-Damirchi, K. Alirezalu, B. Fathi Achachlouei

Abstract:

Microwave energy is a superior alternative to several other thermal treatments. Extraction techniques are widely employed for the isolation of bioactive compounds and vegetable oils from oil seeds. Among the different and newly available techniques, microwave pretreatment of seeds is a simple and desirable method for the production of high-quality vegetable oils. Microwave pretreatment for oil extraction has many advantages, as follows: improved oil extraction yield and quality, direct extraction capability, lower energy consumption, faster processing time and reduced solvent levels compared with conventional methods. It also allows for better retention and availability of desirable nutraceuticals, such as phytosterols, tocopherols, canolol and phenolic compounds, in extracted oils such as rapeseed oil. This can be a new step towards producing nutritional vegetable oils with improved shelf life because of their high antioxidant content.

Keywords: Microwave pretreatment, vegetable oil extraction, nutraceuticals, oil quality

6982 Parallel Text Processing: Alignment of Indonesian to Javanese Language

Authors: Aji P. Wibawa, Andrew Nafalski, Neil Murray, Wayan F. Mahmudy

Abstract:

Parallel text alignment is proposed as a way of aligning words in bahasa Indonesia to words in Javanese. Since a one-to-one word translator does not have the facility to translate the pragmatic aspects of Javanese, the parallel text alignment model described here uses phrase pair combination. The algorithm aligns the parallel text automatically from the beginning to the end of each sentence. Even though the results of the phrase pair combination outperform the previous algorithm, it is still inefficient: recording all possible combinations consumes more space in the database and is time-consuming. The original algorithm is modified by applying an edit distance coefficient to improve the data-storage efficiency. As a result, data-storage consumption is reduced by 90%, as is the learning period (42 s).

Keywords: Parallel text alignment, phrase pair combination, edit distance coefficient, Javanese-Indonesian language.
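A small sketch of the edit-distance coefficient idea: compute the Levenshtein distance between two phrases and normalise it to [0, 1], so that only sufficiently similar phrase pairs need to be recorded. The normalisation, the 0.5 cut-off and the example strings are assumptions for illustration; the paper's exact coefficient is not given in the abstract.

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,         # deletion
                            curr[j - 1] + 1,     # insertion
                            prev[j - 1] + cost)) # substitution
        prev = curr
    return prev[-1]

def edit_distance_coefficient(a, b):
    """Distance normalised by the longer string's length (0 = identical)."""
    longest = max(len(a), len(b)) or 1
    return levenshtein(a, b) / longest

def keep_pair(src_phrase, tgt_phrase, threshold=0.5):
    """Store a phrase pair only if the phrases are similar enough (assumed rule)."""
    return edit_distance_coefficient(src_phrase, tgt_phrase) <= threshold

print(edit_distance_coefficient("kucing itu makan", "kucing iku mangan"))
print(keep_pair("kucing itu makan", "kucing iku mangan"))
```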

6981 Towards a Suitable and Systematic Approach for Component Based Software Development

Authors: Kuljit Kaur, Parminder Kaur, Jaspreet Bedi, Hardeep Singh

Abstract:

The software crisis refers to the situation in which developers are not able to complete projects within time and budget constraints and, moreover, these over-schedule and over-budget projects are of low quality as well. Several methodologies have been adopted from time to time to overcome this situation, and the focus is now on component-based software engineering. In this approach, the emphasis is on the reuse of already existing software artifacts. But the results cannot be achieved just by preaching the principles; they need to be practiced as well. This paper highlights some of the very basic elements of this approach, which have to be in place to achieve the desired goals of high-quality, low-cost software products with a shorter time to market.

Keywords: Component Model, Software Components, Software Repository, Process Models.

6980 Improvement in Properties of Ni-Cr-Mo-V Steel through Process Control

Authors: Arnab Majumdar, Sanjoy Sadhukhan

Abstract:

Although gun barrel steels are an important variety from the defense viewpoint, the available literature is very limited. In the present work, an IF-grade Ni-Cr-Mo-V high-strength low-alloy steel is produced by the Electric Earth Furnace-ESR route. The ingot was hot forged to the desired dimensions with a reduction ratio of 70-75%, followed by homogenization, hardening and tempering treatments. Sample chemistry, NMIR, and macro- and microstructural analyses were carried out. Mechanical properties, which include tensile, impact, and fracture toughness, were studied. Ultrasonic testing was done to identify internal flaws. The existing high-strength low-alloy Ni-Cr-Mo-V steel shows improved properties with the modified processing route and heat treatment schedule in comparison to the properties noted earlier for the manufacture of gun barrels. The improved properties appear able to withstand higher explosive loads with the same amount of steel in gun barrel applications.

Keywords: Gun barrel steels, IF grade, physical properties, thermal and mechanical processing, mechanical properties, ultrasonic testing.

6979 Impact of ISO 9000 on Time-based Performance: An Event Study

Authors: Chris K. Y. Lo, Andy C. L. Yeung, T. C. Edwin Cheng

Abstract:

ISO 9000 is the most popular and widely adopted meta-standard for quality and operational improvements. However, only limited empirical research has been conducted to examine the impact of ISO 9000 on operational performance based on objective and longitudinal data. To reveal any causal relationship between the adoption of ISO 9000 and operational performance, we examined the timing and magnitude of the change in time-based performance as a result of ISO 9000 adoption. We analyzed the changes in operating cycle, inventory days, and accounts receivable days prior to and after the implementation of ISO 9000 in 695 publicly listed manufacturing firms. We found that ISO 9000 certified firms shortened their operating cycle time by 5.28 days one year after the implementation of ISO 9000. In the long run (3 years after certification), certified firms showed continuous improvement in time-based efficiency and experienced an operating cycle time 11 days shorter than that of non-certified firms. There was an average 6.5% improvement in operating cycle time for ISO 9000 certified firms. Both inventory days and accounts receivable days showed similar significant improvements after the implementation of ISO 9000.

Keywords: ISO 9000, Operating Cycle, Time-based efficiency.

6978 Modelling Conditional Volatility of Saving Rate by a Time-Varying Parameter Model

Authors: Katleho D. Makatjane, Kalebe M. Kalebe

Abstract:

The present paper uses time-varying parameters, based on the score function of a probability density at time t, to model the volatility of the saving rate. We used a scaled likelihood function to update the parameters of the model over time. Our results revealed a strong time-varying effect, since the location parameter is greater than zero. Furthermore, we discovered a leptokurtic condition in the saving rate's distribution. The Kapetanios-Shin-Snell nonlinear augmented Dickey-Fuller (KSS-NADF) test showed that the saving rate has a nonlinear unit root; therefore, it can be modeled by a generalised autoregressive score (GAS) model. Additionally, the value at risk (VaR) and conditional tail expectation (CTE) indicate that 99% of the time people in Lesotho are saving more than they are spending. This puts the economy at high risk of not expanding. Therefore, the monetary policy committee (MPC) of Lesotho should revise its monetary policies to address this high-saving-rate risk.

Keywords: Generalized autoregressive score, time-varying, saving rate, Lesotho.
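Empirical VaR and CTE at the 99% level can be computed directly from a sample; the synthetic, fat-tailed data below stand in for the Lesotho saving-rate series, which is not reproduced here, and the left-tail convention is an assumption.

```python
import numpy as np

def var_cte(sample, level=0.99):
    """Empirical left-tail value at risk and conditional tail expectation.

    VaR is the (1 - level) quantile; CTE is the mean of observations at or below VaR.
    """
    var = np.quantile(sample, 1.0 - level)
    tail = sample[sample <= var]
    return var, tail.mean()

rng = np.random.default_rng(0)
saving_rate_changes = rng.standard_t(df=5, size=5000) * 0.8   # placeholder data
var99, cte99 = var_cte(saving_rate_changes, level=0.99)
print(f"99% VaR = {var99:.3f}, 99% CTE = {cte99:.3f}")
```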

6977 An Energy-Efficient Model of Integrating Telehealth IoT Devices with Fog and Cloud Computing-Based Platform

Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo

Abstract:

The rapid growth of telehealth Internet of Things (IoT) devices has raised concerns about energy consumption and efficient data processing. This paper presents an energy-efficient model that integrates telehealth IoT devices with a fog and cloud computing-based platform, offering a sustainable and robust solution to overcome these challenges. Our model employs fog computing as a localized data processing layer while leveraging cloud computing for resource-intensive tasks, significantly reducing energy consumption. We incorporate adaptive energy-saving strategies. Simulation analysis validates our approach's effectiveness in enhancing energy efficiency for telehealth IoT systems integrated with localized fog nodes and both private and public cloud infrastructures. Future research will focus on further optimization of the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability in other healthcare and industry sectors.

Keywords: Energy-efficient, fog computing, IoT, telehealth.

6976 LiDAR Based Real Time Multiple Vehicle Detection and Tracking

Authors: Zhongzhen Luo, Saeid Habibi, Martin v. Mohrenschildt

Abstract:

Self-driving vehicles require a high level of situational awareness in order to maneuver safely when driving in real-world conditions. This paper presents a LiDAR-based real-time perception system that is able to process raw sensor data for multiple-target detection and tracking in dynamic environments. The proposed algorithm is nonparametric and deterministic; that is, no assumptions or a priori knowledge about the input data are needed and no initialization is required. Additionally, the proposed method works directly on the three-dimensional data generated by the LiDAR, without sacrificing the rich information contained in the 3D domain. Moreover, a fast and efficient real-time clustering algorithm based on the radially bounded nearest neighbor (RBNN) approach is applied. The Hungarian algorithm and adaptive Kalman filtering are used for data association and tracking. The proposed algorithm is able to run in real time with an average run time of 70 ms per frame.

Keywords: LiDAR, real-time system, clustering, tracking, data association.
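A compact sketch of radially bounded nearest neighbor (RBNN) style clustering on 3D points: points are connected whenever they lie within a fixed radius of each other, and connected groups form clusters. It uses scipy's k-d tree for the radius queries; the radius, minimum cluster size and synthetic point cloud are assumptions, not the paper's parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

def rbnn_cluster(points, radius=0.5, min_size=5):
    """Label points by flooding through neighbours within `radius` (RBNN-style)."""
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)
    current = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        stack = [i]
        labels[i] = current
        while stack:
            p = stack.pop()
            for q in tree.query_ball_point(points[p], r=radius):
                if labels[q] == -1:
                    labels[q] = current
                    stack.append(q)
        current += 1
    # Discard clusters smaller than min_size (treated as clutter).
    for lab, count in zip(*np.unique(labels, return_counts=True)):
        if count < min_size:
            labels[labels == lab] = -1
    return labels

rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal([0, 0, 0], 0.2, (60, 3)),    # synthetic "vehicle" 1
                   rng.normal([5, 2, 0], 0.2, (60, 3)),    # synthetic "vehicle" 2
                   rng.uniform(-10, 10, (20, 3))])          # sparse clutter
labels = rbnn_cluster(cloud)
print("clusters found:", sorted(set(labels) - {-1}))
```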

6975 A Semi- One Time Pad Using Blind Source Separation for Speech Encryption

Authors: Long Jye Sheu, Horng-Shing Chiou, Wei Ching Chen

Abstract:

We propose a new perspective on speech communication using blind source separation. The original speech is mixed with key signals, which consist of the mixing matrix, chaotic signals and random noise. However, parts of the keys (the mixing matrix and the random noise) are not needed for decryption. In a practical implementation, one can encrypt the speech by changing the noise signal every time. Hence, the present scheme obtains the advantages of one-time pad encryption while avoiding its drawbacks in key exchange. It is demonstrated that the proposed scheme is immune to traditional attacks.

Keywords: one time pad, blind source separation, independent component analysis, speech encryption.
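A toy sketch of the general idea: the speech frame is stacked with a chaotic key signal and a random noise signal and multiplied by a random mixing matrix, and a receiver can then attempt recovery with a blind source separation routine such as FastICA. The logistic-map key, the random matrix and the recovery step are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 4000
t = np.arange(n) / 8000.0                       # 8 kHz "speech" placeholder
speech = np.sin(2 * np.pi * 220 * t) * np.hanning(n)

# Chaotic key signal from a logistic map (assumed key generator) plus random noise.
chaos = np.empty(n)
chaos[0] = 0.37
for i in range(n - 1):
    chaos[i + 1] = 3.99 * chaos[i] * (1.0 - chaos[i])
noise = rng.standard_normal(n)

sources = np.vstack([speech, chaos - chaos.mean(), noise])   # 3 x n
A = rng.standard_normal((3, 3))                              # secret mixing matrix
mixtures = A @ sources                                       # transmitted ciphertext

# Receiver-side separation attempt (up to ICA's scale/permutation ambiguity).
ica = FastICA(n_components=3, random_state=0, max_iter=1000)
recovered = ica.fit_transform(mixtures.T).T
best = max(range(3), key=lambda k: abs(np.corrcoef(recovered[k], speech)[0, 1]))
print("best |correlation| with speech:", abs(np.corrcoef(recovered[best], speech)[0, 1]))
```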
