Search results for: Accounting information quality
1587 Grouping-Based Job Scheduling Model In Grid Computing
Authors: Vishnu Kant Soni, Raksha Sharma, Manoj Kumar Mishra
Abstract:
Grid computing is a high-performance computing environment for solving large-scale computational applications. It encompasses resource management, job scheduling, security, information management, and more. Job scheduling is a fundamental and important issue in achieving high performance in grid computing systems, yet designing an efficient scheduler and implementing it remain a major challenge. Job scheduling algorithms need further improvement so that light-weight (small) jobs can be grouped into coarse-grained jobs, which reduces communication time and processing time and enhances resource utilization. The proposed grouping strategy considers the processing power, memory size, and bandwidth requirements of each job in order to reflect a real grid system. The experimental results demonstrate that the proposed scheduling algorithm reduces the processing time of jobs more efficiently than other approaches.
Keywords: Grid computing, job grouping, job scheduling.
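To make the grouping idea concrete, here is a minimal sketch (not the authors' scheduler) that greedily packs light-weight jobs into coarse-grained groups whose combined processing, memory, and bandwidth requirements stay within an assumed resource capacity; all job attributes and capacity values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    mi: float        # processing requirement (million instructions)
    mem: float       # memory requirement (MB)
    bw: float        # bandwidth requirement (Mbps)

def group_jobs(jobs, cap_mi, cap_mem, cap_bw):
    """Greedily pack small jobs into coarse-grained groups so that the summed
    MI, memory and bandwidth of each group stay within the (assumed) capacities
    offered by a grid resource."""
    groups, current, used = [], [], [0.0, 0.0, 0.0]
    for job in jobs:
        fits = (used[0] + job.mi <= cap_mi and
                used[1] + job.mem <= cap_mem and
                used[2] + job.bw <= cap_bw)
        if not fits and current:
            groups.append(current)
            current, used = [], [0.0, 0.0, 0.0]
        current.append(job)
        used = [used[0] + job.mi, used[1] + job.mem, used[2] + job.bw]
    if current:
        groups.append(current)
    return groups

# Hypothetical light-weight jobs grouped for one resource.
jobs = [Job(f"j{i}", mi=50 + 10 * i, mem=64, bw=2) for i in range(10)]
for g in group_jobs(jobs, cap_mi=300, cap_mem=512, cap_bw=20):
    print([j.name for j in g])
```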
Downloads: 1949

1586 Using Ultrasonic and Infrared Sensors for Distance Measurement
Authors: Tarek Mohammad
Abstract:
The amplitude response of infrared (IR) sensors depends on the reflectance properties of the target; therefore, using an IR sensor to measure distance accurately requires prior knowledge of the surface. This paper describes the use of the Phong illumination model to determine the properties of a surface and subsequently calculate the distance to it. To simplify the calculation, the angular position of the IR sensor is taken as normal to the surface. An ultrasonic (US) sensor provides the initial distance information needed to obtain the parameters for this method. In addition, experimental results obtained with LabVIEW are discussed. Care should be taken when placing objects in front of the sensors during data acquisition, since a small change in angle can yield a distance reading very different from the actual one. Because stereo camera vision systems perform poorly under some environmental conditions, such as plain walls, glass surfaces, or poor lighting, IR and US sensors can be used in addition to improve the overall vision systems of mobile robots.
Keywords: Distance measurement, infrared sensor, surface properties, ultrasonic sensor.
Downloads: 15015

1585 Deterministic Random Number Generators for Online Applications
Authors: Natarajan Vijayarangan, Prasanna S. Bidare
Abstract:
Cryptography, image watermarking, and e-banking are filled with apparent oxymora and paradoxes. Random sequences are used as keys to encrypt information, as watermarks during embedding, and again to extract the watermark during detection. The keys are also heavily utilized for 24x7x365 banking operations. A deterministic random sequence is therefore very useful for online applications: to obtain the same random sequence, one only needs to supply the same seed to the generator. Many researchers have used Deterministic Random Number Generators (DRNGs) for cryptographic applications and Pseudo-Noise random sequences (PNs) for watermarking. Even though PN sequences have some weaknesses under attack, the research community has mostly used them in digital watermarking. On the other hand, DRNGs have not been widely used in online watermarking because of their computational complexity and lack of robustness. We have therefore devised a new design for generating a DRNG using a Pi series, making it useful for online cryptographic, digital watermarking, and banking applications.
Keywords: E-tokens, LFSR, non-linear, Pi series, pseudo random number.
Downloads: 2010

1584 Ethnobotany and Distribution of Dioscorea hispida Dennst. (Dioscoreaceae) in Besut, Marang and Setiu Districts of Terengganu, Peninsular Malaysia
Authors: M. Nashriyah, T. Salmah, M.Y. NurAtiqah, O. Siti Nor Indah, A.W. MuhamadAzhar, S. Munirah, Y. Nornasuha, A. Abdul Manaf
Abstract:
Dioscorea species, commonly known as yams, are reported to be one of the major food sources worldwide. This ethnobotanical study was conducted to document local knowledge and potential uses of Dioscorea hispida Dennst. and to investigate and record its distribution in three districts of Terengganu. Information was gathered from 23 villagers in the districts of Besut, Marang and Setiu using a semi-structured questionnaire. The villagers were randomly selected and no appointments were made prior to the visits. For the distribution survey, the locations of Dioscorea hispida were recorded using the Global Positioning System (GPS). The villagers identified Dioscorea hispida, locally named ubi gadong, by its physical characteristics, including leaf shape, stem, and the color of the tuber's flesh. The villagers used Dioscorea hispida in many ways in their lives, such as for food, medicinal purposes, and fish poison.
Keywords: Dioscorea hispida, ethnobotany, intoxicating yam, ubi gadong, Terengganu.
Downloads: 1796

1583 Wavelet Feature Selection Approach for Heart Murmur Classification
Authors: G. Venkata Hari Prasad, P. Rajesh Kumar
Abstract:
Phonocardiography is important in the appraisal of congenital heart disease and pulmonary hypertension, as it reflects the duration of right ventricular systole. The systolic murmur in patients with an intra-cardiac shunt decreases as pulmonary hypertension develops and may eventually disappear completely as the pulmonary pressure reaches systemic level. Phonocardiography and auscultation are non-invasive, low-cost, and accurate methods of assessing heart disease. In this work, an objective signal processing tool that extracts wavelet-based information from the phonocardiography signal is proposed to classify a murmur as normal or abnormal. Since the feature vector is large, Binary Particle Swarm Optimization (PSO) with mutation is proposed for feature selection. The selected features improve the classification accuracy and were tested across various classifiers, including Naïve Bayes, kNN, C4.5, and SVM.
Keywords: Phonocardiography, Coiflet, feature selection, Particle Swarm Optimization.
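As an illustration of the selection step, the sketch below runs a small binary PSO over feature masks with a kNN classifier as the fitness function; the dataset, swarm size, and velocity-update constants are assumptions standing in for the wavelet features described in the abstract, not the paper's exact settings.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # stand-in for wavelet features
n_particles, n_feat, iters = 12, X.shape[1], 20
w, c1, c2, p_mut = 0.7, 1.5, 1.5, 0.02       # assumed PSO constants

def fitness(mask):
    """Cross-validated kNN accuracy on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pos = rng.integers(0, 2, (n_particles, n_feat))
vel = rng.normal(0, 1, (n_particles, n_feat))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_feat))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))                   # sigmoid transfer function
    pos = (rng.random((n_particles, n_feat)) < prob).astype(int)
    flip = rng.random((n_particles, n_feat)) < p_mut    # mutation step
    pos[flip] ^= 1
    fit = np.array([fitness(p) for p in pos])
    better = fit > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", np.flatnonzero(gbest))
print("cross-validated accuracy: %.3f" % pbest_fit.max())
```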
Downloads: 2473

1582 Application of Double Side Approach Method on Super Elliptical Winkler Plate
Authors: Hsiang-Wen Tang, Cheng-Ying Lo
Abstract:
In this study, the static behavior of a super elliptical Winkler plate is analyzed by applying the double side approach method. The lack of information about super elliptical Winkler plates motivated this study, and the double side approach method is used because of its superior ability to treat problems with complex boundary shapes efficiently. The method has the advantages of high accuracy, an easy calculation procedure, and a light computational load. Most importantly, it provides an error bound for the approximate solution. The numerical results not only show that the double side approach method works well on this problem but also provide knowledge of the static behavior of super elliptical Winkler plates for practical use.
Keywords: Super elliptical Winkler Plate, double side approach method, error bound.
Downloads: 1619

1581 A Visualized Framework for Representing Uncertain and Incomplete Temporal Knowledge
Authors: Yue Wang, Jixin Ma, Brian Knight
Abstract:
This paper presents Visual Time, a visualized computer-aided case tool for non-experts, for representing and reasoning about incomplete and uncertain temporal information. It is both expressive and versatile, allowing logical conjunctions and disjunctions of both absolute and relative temporal relations, such as "Before", "Meets", "Overlaps", "Starts", "During", and "Finishes". As a visualized framework, Visual Time provides a user-friendly environment for describing scenarios with rich temporal structure in natural language, which can be formatted as structured temporal phrases and modeled in terms of Temporal Relationship Diagrams (TRD). A TRD can be automatically and visually transformed into a corresponding Time Graph, supported by an automatic consistency checker that delivers a verdict on whether a given scenario is temporally consistent or inconsistent.
Keywords: Time Visualization, Uncertainty, Incompleteness, Consistency Checking.
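For readers unfamiliar with the relative relations listed above, the following is a minimal sketch (independent of the Visual Time tool itself) that classifies the relation between two fully specified intervals; handling uncertain or incomplete bounds, as the paper does, would require sets of possible endpoints rather than single points.

```python
def relation(a_start, a_end, b_start, b_end):
    """Classify the relation of interval A to interval B for fully known
    endpoints (a simplified, crisp version of the relations named above)."""
    if a_end < b_start:
        return "Before"
    if a_end == b_start:
        return "Meets"
    if a_start == b_start and a_end < b_end:
        return "Starts"
    if a_start > b_start and a_end < b_end:
        return "During"
    if a_start > b_start and a_end == b_end:
        return "Finishes"
    if a_start < b_start < a_end < b_end:
        return "Overlaps"
    if a_start == b_start and a_end == b_end:
        return "Equals"
    return "Other"   # e.g. A after B, or A contains B

print(relation(1, 3, 3, 8))   # Meets
print(relation(2, 5, 1, 9))   # During
```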
Downloads: 1513

1580 Overview Studies of High Strength Self-Consolidating Concrete
Authors: Raya Harkouss, Bilal Hamad
Abstract:
Self-Consolidating Concrete (SCC) is considered a relatively new technology, created as an effective solution to problems associated with low-quality consolidation. An SCC mix is defined as successful if it flows freely and cohesively without mechanical compaction. The construction industry shows a strong tendency to use SCC in many contemporary projects to benefit from the various advantages offered by this technology.
At this point, a main question arises regarding the effect of the enhanced fluidity of SCC on the structural behavior of high-strength self-consolidating reinforced concrete.
A three-phase research program was conducted at the American University of Beirut (AUB) to address this concern. The first two phases consisted of comparative studies on concrete and mortar mixes prepared with a second-generation sulphonated naphthalene-based superplasticizer (SNF) or a third-generation polycarboxylate ether-based superplasticizer (PCE). The third phase of the research program investigates and compares the structural performance of high-strength reinforced concrete beam specimens prepared with the two generations of superplasticizers, which formed the only variable between the concrete mixes. The beams were designed to exhibit flexure, shear, or bond-splitting failure.
The outcomes of the experimental work revealed comparable resistance of beam specimens cast using self-compacting concrete and conventional vibrated concrete. The differences in the experimental values between the SCC and the control VC beams were minimal, leading to the conclusion that the high consistency of SCC has little effect on the flexural, shear, and bond strengths of concrete members.
Keywords: Self-consolidating concrete (SCC), high-strength concrete, concrete admixtures, mechanical properties of hardened SCC, structural behavior of reinforced concrete beams.
Downloads: 2971

1579 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation
Authors: H. Khanfari, M. Johari Fard
Abstract:
Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers because various diagenetic processes produce a wide range of properties throughout the reservoir. A good estimate of reservoir heterogeneity, defined as the variation in rock properties with location in a reservoir or formation, can help in modeling the reservoir and thus offers a better understanding of its behavior. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures, and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods for describing heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods describe the variation in rock properties in terms of the similarity of the environments in which different beds were deposited. To describe the vertical heterogeneity of a reservoir, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), both of which are reviewed briefly in this paper. The Lorenz concept is based on statistics and has been used in petroleum engineering from that point of view. In this paper, we correlate the statistical Lorenz method with a petroleum concept, the Kozeny-Carman equation, and derive the straight-line Lorenz plot for a homogeneous system. Finally, we apply the two methods to a heterogeneous field in southern Iran and discuss each separately, with numbers and figures. As expected, these methods show great departure from homogeneity; therefore, for future investment, the reservoir needs to be treated carefully.
Keywords: Carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variations (V), Lorenz coefficient (L).
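As a reference for the two measures reviewed above, the sketch below computes the Lorenz coefficient from a flow-capacity versus storage-capacity curve and a Dykstra-Parsons coefficient from sample percentiles of permeability; the layered permeability and porosity values are made up, and the paper's correlation with the Kozeny-Carman equation is not reproduced.

```python
import numpy as np

def lorenz_coefficient(k, phi, h):
    """Lorenz coefficient from permeability k, porosity phi and layer
    thickness h: twice the area between the flow-capacity/storage-capacity
    curve and the 45-degree line of a homogeneous system."""
    order = np.argsort(k / phi)[::-1]               # sort by flow per unit storage
    flow = np.cumsum((k * h)[order]) / np.sum(k * h)
    store = np.cumsum((phi * h)[order]) / np.sum(phi * h)
    flow, store = np.insert(flow, 0, 0.0), np.insert(store, 0, 0.0)
    area = float(np.sum(np.diff(store) * (flow[1:] + flow[:-1]) / 2.0))
    return 2.0 * (area - 0.5)

def dykstra_parsons(k):
    """V = (k50 - k84.1) / k50, with k84.1 the permeability exceeded by
    84.1% of the samples (sample-percentile simplification)."""
    k50 = np.percentile(k, 50)
    k84 = np.percentile(k, 100 - 84.1)
    return (k50 - k84) / k50

# Hypothetical layered data (mD, fraction, m).
k   = np.array([250.0, 90.0, 15.0, 400.0, 5.0, 60.0])
phi = np.array([0.22, 0.18, 0.10, 0.25, 0.08, 0.15])
h   = np.ones_like(k)
print("Lorenz L =", round(lorenz_coefficient(k, phi, h), 3))
print("Dykstra-Parsons V =", round(dykstra_parsons(k), 3))
```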
Downloads: 1791

1578 FPGA Hardware Implementation and Evaluation of a Micro-Network Architecture for Multi-Core Systems
Authors: Yahia Salah, Med Lassaad Kaddachi, Rached Tourki
Abstract:
This paper presents the design, implementation, and evaluation of a micro-network, or Network-on-Chip (NoC), based on a generic pipelined router architecture. The router is designed to efficiently support the traffic generated by multimedia applications on embedded multi-core systems. It employs a simple routing mechanism and implements a round-robin scheduling strategy to resolve output-port contention and minimize latency. Virtual-channel flow control is applied to avoid the head-of-line blocking problem and enhance performance in the NoC. The hardware design of the router architecture has been implemented at the register transfer level; its functionality is evaluated for the two-dimensional Mesh/Torus topology, and performance results are derived with the ModelSim simulator and the Xilinx ISE 9.2i synthesis tool. An example multi-core image processing system built on the NoC structure has been implemented and validated to demonstrate the capability of the proposed micro-network architecture. To reduce the complexity of the image compression and decompression architecture, the system uses an image processing algorithm based on the classical discrete cosine transform with an efficient zonal processing approach. The experimental results confirm that both the proposed image compression scheme and the NoC architecture achieve reasonable image quality with lower processing time.
Keywords: Generic Pipeline Network-on-Chip Router Architecture, JPEG Image Compression, FPGA Hardware Implementation, Performance Evaluation.
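To show the arbitration policy mentioned above in isolation, the following is a minimal software sketch of a round-robin arbiter for one output port; the real design is RTL hardware, and the port count and request patterns here are hypothetical.

```python
class RoundRobinArbiter:
    """Grant one of the requesting input ports per cycle, starting the search
    just after the last granted port so every requester is served in turn
    (no starvation)."""
    def __init__(self, n_ports):
        self.n = n_ports
        self.last = self.n - 1          # so port 0 has priority first

    def grant(self, requests):
        for offset in range(1, self.n + 1):
            port = (self.last + offset) % self.n
            if requests[port]:
                self.last = port
                return port
        return None                     # no port requested this cycle

arb = RoundRobinArbiter(4)
for cycle, req in enumerate([[1, 1, 0, 1], [1, 1, 0, 1], [0, 1, 1, 0]]):
    print(f"cycle {cycle}: grant -> port {arb.grant(req)}")
```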
Downloads: 3097

1577 An Automatic Feature Extraction Technique for 2D Punch Shapes
Authors: Awais Ahmad Khan, Emad Abouel Nasr, H. M. A. Hussein, Abdulrahman Al-Ahmari
Abstract:
Sheet-metal parts have been widely applied in the electronics, communication, and mechanical industries in recent decades, but advances in sheet-metal part design and manufacturing still lag behind the increasing importance of sheet-metal parts in modern industry. This paper presents a methodology for the automatic extraction of some common 2D internal sheet-metal features. The features used in this study are taken from the Unipunch™ catalogue. The extraction process starts with data extraction from the STEP file using an object-oriented approach, and with the application of suitable algorithms and rules, all features contained in the catalogue are extracted automatically. Since the extracted features include geometry and engineering information, they are useful for downstream applications such as feature rebuilding and process planning.
Keywords: Feature Extraction, Internal Features, Punch Shapes, Sheet metal, STEP.
Downloads: 2093

1576 An Approach for Transient Response Calculation of Large Nonproportionally Damped Structures Using Component Mode Synthesis
Authors: Alexander A. Muravyov
Abstract:
A minimal-complexity version of component mode synthesis is presented that requires simplified computer programming but still provides adequate accuracy for modeling the lower eigenproperties of large structures and their transient responses. The novelty is that the structure is separated into components along a plane or surface that exhibits rigid-like behavior; thus only the normal modes of each component are needed, without computing any constraint, attachment, or residual-attachment modes. The approach requires only a few (lower) natural frequencies and the corresponding undamped normal modes of each component as input. A novel technique for formulating the equations of motion is shown, in which a double transformation to generalized coordinates is employed, and the formulation of a nonproportional damping matrix in generalized coordinates is presented.
Keywords: Component mode synthesis, finite element models, transient response, nonproportional damping.
Downloads: 1805

1575 Power Efficient OFDM Signals with Reduced Symbol's Aperiodic Autocorrelation
Authors: Ibrahim M. Hussain
Abstract:
Three new, computationally less demanding algorithms based on minimizing the aperiodic autocorrelation of transmitted symbols and on the SLM approach are proposed. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which reduces the PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value of 1; of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted: multiple sequences are generated by modifying a fixed number of complex values in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that the proposed algorithms achieve a significant reduction in PAPR.
Keywords: Aperiodic autocorrelation, OFDM, PAPR, SLM, wireless communication.
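To make the selection principle concrete, the sketch below computes the PAPR of an OFDM symbol and picks the lowest-PAPR candidate among several randomly phase-rotated versions, in the spirit of SLM; the constellation, number of candidates, and oversampling factor are assumptions, not the paper's exact algorithms.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n_candidates, oversample = 256, 8, 4      # assumed parameters

def papr_db(freq_symbols, L=oversample):
    """PAPR of the time-domain OFDM symbol, with a zero-padded IFFT
    (oversampling) to approximate the continuous-time peak."""
    padded = np.concatenate([freq_symbols, np.zeros((L - 1) * len(freq_symbols))])
    x = np.fft.ifft(padded)
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# One QPSK OFDM data symbol.
bits = rng.integers(0, 2, (N, 2))
data = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# SLM-style selection: random +/-1 phase sequences, keep the best candidate.
candidates = data * rng.choice([1, -1], (n_candidates, N))
paprs = np.array([papr_db(c) for c in candidates])
print("original PAPR  : %.2f dB" % papr_db(data))
print("best candidate : %.2f dB (index %d)" % (paprs.min(), paprs.argmin()))
```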
Downloads: 1722

1574 A Dynamic RGB Intensity Based Steganography Scheme
Authors: Mandep Kaur, Surbhi Gupta, Parvinder S. Sandhu, Jagdeep Kaur
Abstract:
Steganography means covered writing; it is the concealment of information within computer files [1]. In other words, it is secret communication that hides the very existence of the message. In this paper, we use the term cover image for an image that does not yet contain a secret message, and stego image for an image with an embedded secret message; the secret message itself is referred to as the stego-message or hidden message. We propose a technique called RGB intensity based steganography, since the RGB model is commonly used in this field to hide data. The methods used here are based on manipulating the least significant bits of pixel values [3][4] or on rearranging colors to create least-significant-bit or parity-bit patterns that correspond to the message being hidden. The proposed technique attempts to overcome the problems of sequential embedding and of using a stego-key to select the pixels.
Keywords: Steganography, Stego Image, RGB Image, Cryptography, LSB.
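For readers new to the underlying mechanism, the sketch below embeds and recovers a short message in the least significant bits of an RGB image array; it illustrates plain sequential LSB embedding only, not the dynamic, intensity-based pixel selection proposed in the paper.

```python
import numpy as np

def embed_lsb(cover, message):
    """Write the message bits into the least significant bits of the flattened
    RGB cover image (sequential embedding, shown for illustration)."""
    data = message.encode() + b"\x00"            # null terminator
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    flat = cover.flatten()                       # copy of the pixel values
    if bits.size > flat.size:
        raise ValueError("message too long for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego):
    """Collect the LSBs, pack them into bytes and stop at the terminator."""
    bits = stego.flatten() & 1
    data = np.packbits(bits).tobytes()
    return data.split(b"\x00", 1)[0].decode()

cover = np.random.default_rng(0).integers(0, 256, (64, 64, 3), dtype=np.uint8)
stego = embed_lsb(cover, "hidden message")
print(extract_lsb(stego))                        # -> hidden message
```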
Downloads: 2111

1573 Hybrid Algorithm for Hammerstein System Identification Using Genetic Algorithm and Particle Swarm Optimization
Authors: Tomohiro Hachino, Kenji Shimoda, Hitoshi Takata
Abstract:
This paper presents a method for model selection and identification of Hammerstein systems by hybridizing the genetic algorithm (GA) and particle swarm optimization (PSO). The unknown nonlinear static part to be estimated is approximately represented by an automatic choosing function (ACF) model. The weighting parameters of the ACF and the system parameters of the linear dynamic part are estimated by the linear least-squares method, while the adjusting parameters of the ACF model structure are selected by the hybrid GA-PSO algorithm, with the Akaike information criterion used as the evaluation function. Simulation results demonstrate the effectiveness of the proposed hybrid algorithm.
Keywords: Hammerstein system, identification, automatic choosing function model, genetic algorithm, particle swarm optimization.
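For orientation, the sketch below identifies a small simulated Hammerstein system with the linear least-squares step described above, using a generic polynomial basis as a stand-in for the ACF model; the GA-PSO structure selection and the AIC evaluation are not shown.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a Hammerstein system: static nonlinearity followed by linear dynamics.
# True nonlinearity f(u) = u + 0.5*u**2, linear part y(t) = 0.6*y(t-1) + v(t-1).
N = 400
u = rng.uniform(-1, 1, N)
v = u + 0.5 * u ** 2
y = np.zeros(N)
for t in range(1, N):
    y[t] = 0.6 * y[t - 1] + v[t - 1] + 0.01 * rng.normal()

# Linear least squares: regress y(t) on y(t-1) and on polynomial basis
# functions of u(t-1) (a generic substitute for the ACF model).
basis = lambda x: np.column_stack([x, x ** 2, x ** 3])
Phi = np.column_stack([y[:-1], basis(u[:-1])])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
y_hat = Phi @ theta

print("estimated parameters:", np.round(theta, 3))   # close to [0.6, 1.0, 0.5, 0.0]
print("fit (1 - residual variance / output variance): %.4f"
      % (1 - np.var(y[1:] - y_hat) / np.var(y[1:])))
```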
Downloads: 1501

1572 Innovativeness, Risk Taking, Focusing on Opportunity Attitudes on Nurse Managers and Nurses
Authors: Melek Kalkan, Hatice Odacı, Hatice Epli Koç
Abstract:
The aim of this study is to compare the innovativeness, risk taking, and focus on opportunity of nurse managers and nurses. The data were collected from nurse managers and nurses at Ondokuz Mayıs University Faculty of Medicine Hospital and Karadeniz Technical University Faculty of Medicine Hospital. The study sample consisted of 151 participants: 76 nurse managers (50.3%) and 75 nurses (49.7%). All participants were assessed with a Participant Information Form and the Corporate Entrepreneurship Scale. An independent t-test was applied for data analysis. The results show significant differences between nurse managers and nurses in innovativeness (t = 2.42, p < 0.05), risk taking (t = 3.62, p < 0.01), and focusing on opportunity (t = 2.16, p < 0.05). Consequently, it can be said that nurse managers are more innovative than nurses and tend to take more risks and focus more on opportunities.
Keywords: Focusing on opportunity attitudes, innovativeness, risk taking, nurse.
Downloads: 2478

1571 Life Cycle Assessment of Seawater Desalinization in Western Australia
Authors: Wahidul K. Biswas
Abstract:
Perth will run out of available sustainable natural water resources by 2015 if nothing is done to slow usage rates, according to a Western Australian study [1]. Alternative water technology options need to be considered for the long-term guaranteed supply of water for agricultural, commercial, domestic, and industrial purposes. Seawater is an alternative source of water for human consumption, because it can be desalinated and supplied in large quantities at a very high quality. While seawater desalination is a promising option, the technology requires a large amount of energy, which is typically generated from fossil fuels. The combustion of fossil fuels emits greenhouse gases (GHG) and is implicated in climate change. In addition to the emissions from electricity generation for desalination, greenhouse gases are emitted in the production of chemicals and membranes for water treatment. Since Australia is a signatory to the Kyoto Protocol, it is important to quantify the greenhouse gas emissions from desalinated water production. A life cycle assessment (LCA) has been carried out to determine the greenhouse gas emissions from the production of 1 gigalitre (GL) of water from the new plant. In this LCA, a new desalination plant to be installed in Bunbury, Western Australia, known as the Southern Seawater Desalinization Plant (SSDP), was taken as a case study. The system boundary of the LCA consists mainly of three stages: seawater extraction, treatment, and delivery. The analysis found that the equivalent of 3,890 tonnes of CO2 could be emitted from the production of 1 GL of desalinated water. It also identified that the reverse osmosis process would cause the most significant greenhouse emissions, as a result of the electricity used, if this electricity is generated from fossil fuels.
Keywords: Desalinization, greenhouse gas emissions, life cycle assessment.
Downloads: 4118

1570 Churn Prediction for Telecommunication Industry Using Artificial Neural Networks
Authors: Ulas Vural, M. Ergun Okay, E. Mesut Yildiz
Abstract:
Telecommunication service providers demand accurate and precise prediction of customer churn probabilities to increase the effectiveness of their customer relation services. The large amount of customer data owned by the service providers is well suited to analysis by machine learning methods. In this study, customer expenditure data are analyzed using an artificial neural network (ANN). The ANN model is applied to the data of customers with different billing durations. The proposed model predicts churn probabilities with 83% accuracy using only three months of expenditure data, and the prediction accuracy increases to 89% when nine months of data are used. The experiments also show that the accuracy of the ANN model increases on an extended feature set that includes information on changes in the bill amounts.
Keywords: Customer relationship management, churn prediction, telecom industry, deep learning, artificial neural networks, ANN.
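A minimal sketch of the modeling step is shown below, using a small multilayer perceptron on synthetic expenditure features; the feature construction, network size, and data are assumptions standing in for the operator data described in the abstract.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Synthetic stand-in: nine months of bill amounts plus month-to-month changes.
n = 2000
bills = rng.gamma(shape=2.0, scale=30.0, size=(n, 9))
deltas = np.diff(bills, axis=1)                    # changes in bill amounts
X = np.hstack([bills, deltas])
# Toy rule: churn is more likely for customers whose recent spend is falling.
churn = (deltas[:, -3:].mean(axis=1) + rng.normal(0, 5, n) < -5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, churn, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(32, 16),
                                    max_iter=500, random_state=0))
model.fit(X_tr, y_tr)
print("test accuracy: %.3f" % model.score(X_te, y_te))
print("churn probability of first test customer: %.2f"
      % model.predict_proba(X_te[:1])[0, 1])
```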
Downloads: 760

1569 Predictive Clustering Hybrid Regression (pCHR) Approach and Its Application to Sucrose-Based Biohydrogen Production
Authors: Nikhil, Ari Visa, Chin-Chao Chen, Chiu-Yue Lin, Jaakko A. Puhakka, Olli Yli-Harja
Abstract:
A predictive clustering hybrid regression (pCHR) approach was developed and evaluated using a dataset from an H2-producing sucrose-based bioreactor operated for 15 months. The aim was to model and predict the H2-production rate using available information about the envirome and metabolome of the bioprocess. Self-organizing maps (SOM) and the Sammon map were used to visualize the dataset and to identify the main metabolic patterns and clusters in the bioprocess data. Three metabolic clusters were detected: acetate coupled with other metabolites, butyrate only, and transition phases. The developed pCHR model combines principles of k-means clustering, kNN classification, and regression techniques. The model performed well in modeling and predicting the H2-production rate, with mean square error values of 0.0014 and 0.0032, respectively.
Keywords: Biohydrogen, bioprocess modeling, clustering hybrid regression.
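The sketch below shows one way to combine the three ingredients named above: k-means to form clusters on training data, kNN to route new samples to a cluster, and a separate regression model per cluster for prediction. The synthetic data and model choices are illustrative assumptions, not the authors' exact pCHR formulation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)

# Synthetic "envirome/metabolome" features with three regimes and a target rate.
X = np.vstack([rng.normal(m, 0.5, (100, 4)) for m in (0.0, 2.0, 4.0)])
y = X @ np.array([0.5, -0.2, 0.3, 0.1]) + rng.normal(0, 0.05, 300)

# 1) Cluster the training data into metabolic regimes.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = km.labels_

# 2) kNN classifier to assign unseen samples to a cluster.
router = KNeighborsClassifier(n_neighbors=5).fit(X, labels)

# 3) Separate regression model per cluster.
models = {c: LinearRegression().fit(X[labels == c], y[labels == c])
          for c in np.unique(labels)}

def predict(X_new):
    clusters = router.predict(X_new)
    return np.array([models[c].predict(x.reshape(1, -1))[0]
                     for c, x in zip(clusters, X_new)])

X_test = rng.normal(2.0, 0.5, (5, 4))
print("predicted H2-production rates:", np.round(predict(X_test), 3))
```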
Downloads: 1777

1568 Lifelong Education for Teachers: A Tool for Achieving Effective Teaching and Learning in Secondary Schools in Benue State, Nigeria
Authors: P. I. Adzongo, O. A. Aloga
Abstract:
The purpose of the study was to examine lifelong education for teachers as a tool for achieving effective teaching and learning. Lifelong education enhances social inclusion, personal development, citizenship, employability, teaching and learning, the community, and the nation. It is imperative that teachers update their knowledge regularly to perform optimally, since they hold a major position in inculcating desirable qualities in students; the challenges of lifelong education are also discussed. A descriptive survey design was adopted for the study. A simple random sampling technique was used to select 80 teachers as a sample from a population of 105 senior secondary school teachers in Makurdi Local Government Area of Benue State. A 20-item self-designed questionnaire, subjected to expert validation and reliability testing, was used to collect data. A reliability coefficient of 0.87 was established using Cronbach's Alpha; mean scores and standard deviations were used to answer the two research questions, while chi-square was used to analyse data for the two null hypotheses, which state that lifelong education for teachers is not a significant tool for achieving effective teaching and that lifelong education for teachers does not significantly impact effective learning. The findings revealed that lifelong education for teachers can be used as a tool for achieving effective teaching and learning, and the study recommended, among other things, that government, organizations, and individuals collaborate to put lifelong education programmes for teachers on the priority list. The paper concluded that the strategic position of lifelong education for teachers in enhancing teaching, learning, and the production of quality manpower in society makes it imperative for all hands to be on deck to support the programme financially and otherwise.
Keywords: Lifelong education, tool, effective teaching and learning.
Downloads: 1467

1567 Analysis and Control of Camera Type Weft Straightener
Authors: Jae-Yong Lee, Gyu-Hyun Bae, Yun-Soo Chung, Dae-Sub Kim, Jae-Sung Bae
Abstract:
In general, fabric is heat-treated using a stenter machine in order to dry it and fix its shape. It is important to shape the fabric before the heat treatment because it is difficult to revert once the fabric has set. To produce a product of the right shape, camera type weft straighteners have recently been applied to capture and process fabric images quickly; they are more powerful in determining the final textile quality than photo-sensors. Positioned in front of a stenter machine, the weft straightener helps to spread the fabric evenly and to keep the angle between warp and weft constant at a right angle by handling the skew and bow rollers. To manage this tricky procedure, a structural analysis should be carried out in advance, on the basis of which the control technology can be derived. The structural analysis determines the specific contact and slippage characteristics between fabric and roller. We have already examined the applicability of the camera type weft straightener to plain weave fabric and established its feasibility and the specific working conditions of the machine and rollers. In this research, we explore a further application, namely whether the camera type weft straightener can be used for special fabrics. To find the optimum condition, we increased the number of rollers. The analysis was performed with ANSYS software using the finite element analysis method, and the control function was demonstrated by experiment. In conclusion, the structural analysis of the weft straightener identifies the specific characteristics between roller and fabric, the control of the skew and bow rollers reduces the error in the angle between warp and weft, and it is shown that the camera type straightener can also be used for special fabrics.
Keywords: Camera type weft straightener, structure analysis, control, skew and bow roller.
Downloads: 1451

1566 Integrated Design in Additive Manufacturing Based on Design for Manufacturing
Authors: E. Asadollahi-Yazdi, J. Gardan, P. Lafon
Abstract:
Nowadays, manufacturers face producing different versions of products due to quality, cost, and time constraints. At the same time, Additive Manufacturing (AM), as a production method based on a CAD model, disrupts the design and manufacturing cycle with new parameters. To address these issues, researchers have applied the Design for Manufacturing (DFM) approach to AM, but until now there has been no integrated approach for the design and manufacturing of products through AM. This paper therefore aims to provide a general methodology for managing the various production issues, as well as supporting interoperability between the AM process and different Product Life Cycle Management tools. The problem is that the systems engineering models used for managing complex systems cannot support product evolution and its impact on the product life cycle; it therefore seems necessary to provide a general methodology for managing the product diversity created by using AM. This methodology must consider manufacture and assembly during product design as early as possible in the design stage. The latest DFM approach, as a methodology for analyzing the system comprehensively, integrates manufacturing constraints into the numerical model upstream, so DFM for AM is used to import the characteristics of AM into the design and manufacturing process of a hybrid product and to manage the criteria coming from AM. The research also presents an integrated design method that takes into account knowledge of layer manufacturing technologies. For this purpose, an interface model based on the skin and skeleton concepts is provided: the usage and manufacturing skins describe the functional surfaces of the product, while the usage and manufacturing skeletons describe the material flow and the links between the skins. This integrated approach is therefore a helpful methodology for designers and manufacturers in decisions such as material and process selection, as well as in evaluating product manufacturability.
Keywords: Additive manufacturing, 3D printing, design for manufacturing, integrated design, interoperability.
Downloads: 2257

1565 Optical Flow Based Moving Object Detection and Tracking for Traffic Surveillance
Authors: Sepehr Aslani, Homayoun Mahdavi-Nasab
Abstract:
Automated motion detection and tracking is a challenging task in traffic surveillance. In this paper, a system is developed to gather useful information from stationary cameras for detecting moving objects in digital videos. The detection and tracking system is based on optical flow estimation, combined with relevant computer vision and image processing techniques to enhance the process. A median filter is used to remove noise, and unwanted objects are removed by applying thresholding algorithms within morphological operations; object-type restrictions are then set using blob analysis. The results show that the proposed system successfully detects and tracks moving objects in urban videos.
Keywords: Optical flow estimation, moving object detection, tracking, morphological operation, blob analysis.
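A compressed version of that pipeline is sketched below with OpenCV: dense optical flow between consecutive frames, thresholding of the flow magnitude, a morphological opening, and connected-component (blob) analysis. The video path, motion threshold, and minimum blob area are hypothetical values, not the paper's settings.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("traffic.avi")          # hypothetical input video
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not open traffic.avi")
prev_gray = cv2.medianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), 5)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.medianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 5)  # noise removal
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)      # dense flow
    mag = np.linalg.norm(flow, axis=2)
    mask = (mag > 1.0).astype(np.uint8) * 255                          # motion threshold
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)              # clean speckle
    n, _, stats, _ = cv2.connectedComponentsWithStats(mask)            # blob analysis
    for i in range(1, n):                                              # skip background
        x, y, w, h, area = stats[i]
        if area > 150:                                                 # size restriction
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    prev_gray = gray
    # cv2.imshow("moving objects", frame); cv2.waitKey(1)
cap.release()
```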
Downloads: 10156

1564 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to predict what might happen under different conditions or decisions. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution can assume the characteristics of many different types of distributions, which has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, namely the explicit expression of the process, its trend functions, and its distribution, by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. We then develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance for application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects a pragmatic decision on the part of the modeler: given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trend functions, two-parameter Weibull density function.
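As a rough illustration of a diffusion whose trend follows a Weibull-type density, the sketch below simulates sample paths with an Euler-Maruyama scheme, assuming a drift proportional to the two-parameter Weibull probability density evaluated at time t and a constant relative diffusion coefficient; the paper's exact stochastic differential equation and its bi-Weibull trend are not reproduced here.

```python
import numpy as np

def weibull_pdf(t, shape, scale):
    """Two-parameter Weibull probability density function."""
    z = t / scale
    return (shape / scale) * z ** (shape - 1) * np.exp(-(z ** shape))

def simulate_paths(x0=1.0, shape=2.0, scale=5.0, sigma=0.1,
                   T=15.0, n_steps=1500, n_paths=5, seed=0):
    """Euler-Maruyama simulation of the assumed illustrative SDE
    dX_t = X_t f(t) dt + sigma X_t dW_t, with f the Weibull pdf."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    t = np.linspace(dt, T, n_steps)
    X = np.full((n_paths, n_steps + 1), x0)
    for k in range(n_steps):
        drift = X[:, k] * weibull_pdf(t[k], shape, scale)
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        X[:, k + 1] = X[:, k] + drift * dt + sigma * X[:, k] * dW
    return X

paths = simulate_paths()
print("terminal values of the simulated paths:", np.round(paths[:, -1], 3))
```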
Downloads: 1968

1563 Goal-Based Request Cloud Resource Broker in Medical Application
Authors: Mohamad Izuddin Nordin, Azween Abdullah, Mahamat Issa Hassan
Abstract:
In this paper, a cloud resource broker using goal-based requests in a medical application is proposed. To handle the recent huge production of digital images and data in medical informatics applications, the cloud resource broker can be used by medical practitioners to discover and select the correct information and applications. This paper summarizes several reviewed articles that relate medical informatics applications to current broker technology, and presents research on applying goal-based requests in a cloud resource broker to optimize the use of resources in a cloud environment. The objective of proposing a new kind of resource broker is to enhance the current resource scheduling, discovery, and selection procedures. We believe it could help to maximize resource allocation in medical informatics applications.
Keywords: Broker, cloud computing, medical informatics, resource discovery, resource selection.
Downloads: 2059

1562 The Impact of System Cascading Collapse and Transmission Line Outages to the Transfer Capability Assessment
Authors: N. A. Salim, M. M. Othman, I. Musirin, M. S. Serwan
Abstract:
Uncertainty in system operating conditions is one of the causes that may lead to instability of a transmission system. For that reason, an accurate assessment of the transmission reliability margin (TRM) is essential to ensure effective power transfer between areas during the occurrence of system uncertainties. The power transfer is also expressed as the available transfer capability (ATC), which is the information required by utilities and marketers to initiate the selling and buying of electric energy. This paper proposes a computationally effective approach to estimate TRM and ATC while considering the uncertainties of system cascading collapse and transmission line outages. According to the results obtained, the proposed method is valuable for transmission providers and could help power marketers and planning sectors in operating and reserving transmission services based on the calculated ATC.
Keywords: Available transfer capability, System cascading collapse, Transmission line outages, Transmission reliability margin.
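For context, the relationship between these quantities as commonly defined in transfer capability frameworks (and not restated in the abstract) is

ATC = TTC - TRM - CBM - ETC

where TTC is the total transfer capability, CBM the capacity benefit margin, and ETC the existing transmission commitments; a larger estimated TRM therefore directly reduces the ATC offered to the market.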
Downloads: 2053

1561 Social Media as a Distribution Channel for Thailand’s Rice Berry Product
Authors: Phutthiwat Waiyawuththanapoom, Wannapong Waiyawuththanapoom, Pimploi Tirastittam
Abstract:
Nowadays, in the globalization era, social media plays an important role in people's lifestyles as an information source, as a tool to connect people, and in other ways. This research aims to determine the significance of social media as a distribution channel for an agricultural product of Thailand. The agricultural product studied is Rice Berry, a cross-bred unmilled rice producing dark violet grain, a combination of Hom Nin rice and Thai Jasmine/Fragrant Rice 105. Rice Berry has very high nutritional value and a pleasant aroma, so the product is in the growth stage of the product life cycle. The problems for the Rice Berry product in Thailand are production and the distribution channel. This study confirms that social media is another viable distribution channel for a product that is not mass produced, and it can serve as a role model for other niche-market products when selecting a distribution channel.
Keywords: Distribution, Social Media, Rice Berry, Distribution Channel.
Downloads: 3255

1560 Enhancement of Low Contrast Satellite Images using Discrete Cosine Transform and Singular Value Decomposition
Authors: A. K. Bhandari, A. Kumar, P. K. Padhy
Abstract:
In this paper, a novel technique for the contrast enhancement of low-contrast satellite images is proposed, based on singular value decomposition (SVD) and the discrete cosine transform (DCT). The singular value matrix represents the intensity information of the given image, and any change in the singular values changes the intensity of the input image. The proposed technique converts the image into the SVD-DCT domain and, after normalizing the singular value matrix, reconstructs the enhanced image using the inverse DCT. The visual and quantitative results suggest that the proposed SVD-DCT method clearly shows increased efficiency and flexibility over existing methods such as linear contrast stretching, GHE, DWT-SVD, DWT, decorrelation stretching, and gamma-correction-based techniques.
Keywords: Singular value decomposition (SVD), discrete cosine transform (DCT), image equalization, satellite image contrast enhancement.
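One way to realize the "normalization" of the singular value matrix described above is sketched below: the singular values of the DCT coefficients are scaled by a correction factor taken from a histogram-equalized reference before the inverse DCT. The use of global histogram equalization for the reference and the simple scaling rule are assumptions about the scheme, and the file names are hypothetical.

```python
import cv2
import numpy as np
from scipy.fft import dctn, idctn

def svd_dct_enhance(img_gray):
    """Scale the singular values of the DCT coefficients so that their maximum
    matches that of a histogram-equalized reference, then reconstruct with the
    inverse DCT (assumed normalization scheme)."""
    A = img_gray.astype(np.float64)
    ref = cv2.equalizeHist(img_gray).astype(np.float64)    # GHE reference

    D_A = dctn(A, norm="ortho")
    D_ref = dctn(ref, norm="ortho")
    U, S, Vt = np.linalg.svd(D_A, full_matrices=False)
    S_ref = np.linalg.svd(D_ref, compute_uv=False)

    xi = S_ref.max() / S.max()                             # correction factor
    D_new = U @ np.diag(xi * S) @ Vt
    out = idctn(D_new, norm="ortho")
    return np.clip(out, 0, 255).astype(np.uint8)

img = cv2.imread("satellite.png", cv2.IMREAD_GRAYSCALE)    # hypothetical input
if img is not None:
    cv2.imwrite("enhanced.png", svd_dct_enhance(img))
```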
Downloads: 3838

1559 Factors Influencing Household Expenditure Patterns on Cereal Grains in Nasarawa State, Nigeria
Authors: E. A. Ojoko, G. B. Umbugadu
Abstract:
This study describes the expenditure patterns of households on millet, maize, and sorghum across income groups in Nasarawa State. A multi-stage sampling technique was used to select a sample of 316 respondents. The Almost Ideal Demand System (AIDS) model was adopted for the study. The results show that the average household size was five persons, with a dependency ratio of 52%, which plays an important role in the household expenditure pattern by increasing the household budget share. On average, 82% of households were male-headed, with an average age of 49 years and 13 years of formal education. The results on expenditure shares show that maize has the highest expenditure share, at 38%, across the three income groups, and that most of the price effects are significantly different from zero at the 5% significance level. This indicates that the low price of maize increased its demand relative to the other cereals. Household size and the age of household members are major factors affecting the demand for cereals, which agrees with the fact that a larger household brings about increased consumption. The results on the factors influencing preferences for cereal grains reveal that cooking quality and appearance (65.7%) were the most important factors affecting the demand for maize in the study area. This study recommends that cereal crop production be prioritized in government policies and that farming activities that help boost food security and alleviate poverty be subsidized.
Keywords: Expenditure pattern, AIDS model, budget share, price, cereal grains, consumption.
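For reference, the budget-share equation of the AIDS model used in the study has the standard form below (notation follows the usual Deaton-Muellbauer specification rather than anything restated in the abstract), where w_i is the budget share of cereal i, p_j the price of cereal j, x total expenditure, and P a price index:

w_i = \alpha_i + \sum_j \gamma_{ij} \ln p_j + \beta_i \ln(x/P)

The price effects discussed above correspond to the \gamma_{ij} terms, and the income (expenditure) effect to \beta_i.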
Downloads: 2339

1558 A New Protocol for Concealed Data Aggregation in Wireless Sensor Networks
Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie
Abstract:
Wireless sensor networks (WSNs) consist of many sensor nodes placed in unattended environments, such as military sites, in order to collect important information. It is essential to implement a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computation, and storage overheads low. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key for sending and receiving concealed data among themselves. Because data aggregation within each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot inject false data into the network. To evaluate the performance of the proposed protocol, computational models are presented that show its performance and low overhead.
Keywords: Wireless sensor networks, security, concealed data aggregation.
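To give a flavour of concealed aggregation with a per-cell shared key, the sketch below uses simple additive masking: each node adds a keyed pseudo-random mask to its reading, the cell head sums the concealed values without learning them, and the sink removes the combined masks. This is only an illustrative construction under assumed parameters, not the authenticated protocol proposed in the paper.

```python
import hashlib

MOD = 2 ** 32          # arithmetic modulus (assumed)

def mask(cell_key: bytes, node_id: int, epoch: int) -> int:
    """Keyed pseudo-random mask derived from the cell's shared key."""
    digest = hashlib.sha256(cell_key + node_id.to_bytes(4, "big")
                            + epoch.to_bytes(4, "big")).digest()
    return int.from_bytes(digest[:4], "big")

def conceal(reading: int, cell_key: bytes, node_id: int, epoch: int) -> int:
    return (reading + mask(cell_key, node_id, epoch)) % MOD

def aggregate(concealed_values):
    """Cell head adds concealed values without seeing the readings."""
    return sum(concealed_values) % MOD

def reveal_sum(agg: int, cell_key: bytes, node_ids, epoch: int) -> int:
    """Sink, which knows the cell key and member list, strips the masks."""
    total_mask = sum(mask(cell_key, n, epoch) for n in node_ids) % MOD
    return (agg - total_mask) % MOD

cell_key, epoch = b"cell-7-shared-key", 42
readings = {1: 23, 2: 19, 3: 27}                     # hypothetical sensor readings
concealed = [conceal(v, cell_key, n, epoch) for n, v in readings.items()]
agg = aggregate(concealed)
print("recovered sum:", reveal_sum(agg, cell_key, readings.keys(), epoch))   # 69
```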
Downloads: 1735