Search results for: communication model
6973 A Strategy for a Robust Design of Cracked Stiffened Panels
Authors: Francesco Caputo, Giuseppe Lamanna, Alessandro Soprano
Abstract:
This work focuses on the numerical prediction of the fracture resistance of a flat stiffened panel made of the aluminium alloy 2024 T3 under monotonic traction. The numerical simulations are based on the micromechanical Gurson-Tvergaard (GT) model for ductile damage. The applicability of the GT model to this kind of structural problem has been studied and assessed by comparing numerical results, obtained with the WARP 3D finite element code, with experimental data available in the literature. An in-house procedure, based on the stochastic design improvement (SDI) technique, is then presented, which aims to increase the residual strength of a cracked stiffened aluminium panel; a complete application example is given to illustrate the technique.
Keywords: Residual strength, R-Curve, Gurson model, SDI.
6972 A Procedure to Assess Streamflow Rating Curves and Streamflow Sequences
Authors: Elena Carcano, Mirzi Betasolo
Abstract:
This study aims to provide sub-hourly streamflow predictions and associated rating curves for small catchments with an intermittent and torrential flow regime characterized by flash floods, occurring especially during April and November. The methodology entails two lumped conceptual hydrological models which work in series. The overall model is based on eleven parameters and shows good flexibility in handling different input sets. The runoff coefficient, treated as an additional parameter, has contributed to improving the model's performance, while sensitivity analysis has highlighted how slight changes in the model's input can lead to changes in its output. The adopted procedure is robust and provides very practical engineering information while remaining parsimonious both in the input data and in the number of adopted parameters. Based on the obtained results, the authors encourage testing this combined procedure on different hydrological scenarios in order to provide information for poorly monitored catchments and sites with outdated records.
Keywords: Streamflow rating curve, chronological data, streamflow sequences, conceptual models.
6971 Optical and Double Folding Model Analysis for Alpha Particles Elastically Scattered from 9Be and 11B Nuclei at Different Energies
Authors: Ahmed H. Amer, A. Amar, Sh. Hamada, I. I. Bondouk, F. A. El-Hussiny
Abstract:
Elastic scattering of α-particles from 9Be and 11B nuclei at different alpha energies has been analyzed. Optical model parameters (OMPs) of α-particle elastic scattering by these nuclei at different energies have been obtained. In the present calculations, the real part of the optical potential is derived by folding the nucleon-nucleon (NN) interaction into the nuclear matter density distributions of the projectile and target nuclei using the computer code FRESCO. A density-dependent version of the M3Y interaction (CDM3Y6), which is based on the G-matrix elements of the Paris NN potential, has been used. Volumetric integrals of the real and imaginary potential depths (JR, JW) have been calculated and found to be energy dependent. Good agreement between the experimental data and the theoretical predictions is obtained over the whole angular range. In the double folding (DF) calculations, the obtained normalization coefficient Nr is in the range 0.70–1.32.
Keywords: Elastic scattering of α-particles, optical model parameters, double folding model, nucleon-nucleon interaction.
6970 Numerical Simulations on Feasibility of Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization
Authors: Taiki Baba, Tomoaki Hashimoto
Abstract:
The random dither quantization method enables much better performance than simple uniform quantization in the design of quantized control systems. Motivated by this fact, a stochastic model predictive control method, in which a performance index is minimized subject to probabilistic constraints imposed on the state variables, has been proposed for linear feedback control systems with random dither quantization. In other words, a method for solving optimal control problems subject to probabilistic state constraints for linear discrete-time control systems with random dither quantization has already been established. To the best of our knowledge, however, the feasibility of such optimal control problems has not yet been studied. Our objective in this paper is to investigate this feasibility. To this end, we provide the results of numerical simulations that verify the feasibility of stochastic model predictive control problems for linear discrete-time control systems with random dither quantization.
Keywords: Model predictive control, stochastic systems, probabilistic constraints, random dither quantization.
6969 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models
Authors: Danielle Shackley, Yetunde Folajimi
Abstract:
As more people turn to the internet seeking health-related information, there is a greater risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines supplemented with health information about COVID-19 collected from social media sources. We started with data preprocessing and tested various vectorization methods such as Count and TF-IDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis; those same models were then reproduced with the feature added. We evaluated the models using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and a constant 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, the precision score improved by 1.9% with the Complement model. Future expansion of this work could include replicating the experiment and substituting a deep learning neural network model for the Naive Bayes classifiers.
Keywords: Sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model.
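For readers who want to see how such a pipeline fits together, the sketch below combines TF-IDF features with a sentiment-polarity column and runs the three Naive Bayes variants named in the abstract. It is a minimal illustration, not the authors' code: the toy headlines, labels, polarity scores and train/test split are all hypothetical.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import BernoulliNB, MultinomialNB, ComplementNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, accuracy_score

# Hypothetical data: 1 = fake, 0 = reliable; sentiment kept in [0, 1] so the
# multinomial/complement models only see non-negative features.
headlines = [
    "miracle herb cures covid overnight",
    "vaccine trial reports promising interim results",
    "doctors hide this one trick to reverse diabetes",
    "study links regular exercise to lower blood pressure",
    "5g towers spread the virus, insiders claim",
    "health agency updates mask guidance for hospitals",
]
labels = np.array([1, 0, 1, 0, 1, 0])
sentiment = np.array([0.9, 0.7, 0.8, 0.6, 0.2, 0.5])

X_text = TfidfVectorizer().fit_transform(headlines)
X = hstack([X_text, csr_matrix(sentiment.reshape(-1, 1))])  # append sentiment feature

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.33, random_state=0)
for nb in (BernoulliNB(), MultinomialNB(), ComplementNB()):
    y_hat = nb.fit(X_tr, y_tr).predict(X_te)
    print(type(nb).__name__,
          "precision:", precision_score(y_te, y_hat, zero_division=0),
          "accuracy:", accuracy_score(y_te, y_hat))
```

Dropping the sentiment column from `X` reproduces the kind of benchmark models the abstract compares against.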
6968 Secure Socket Layer in the Network and Web Security
Authors: Roza Dastres, Mohsen Soori
Abstract:
In order to exchange information electronically between network users on the web of data, different software tools, such as Outlook, have been developed. The traffic of users on a site, or even between the floors of a building, can be decreased by applying secure and reliable data sharing software. It is essential to provide a fast, secure and reliable network system for data sharing webs in order to create advanced communication systems for network users. In the present research work, different encoding methods and algorithms in data sharing systems are studied in order to increase the security of these systems by preventing hackers from accessing the transferred data. To increase security in networks, the possibility of textual conversation between customers of a local network is studied. The application of encryption and decryption algorithms is studied in order to increase security in networks by preventing hackers from infiltrating them. As a result, a reliable and secure communication system between members of a network can be provided by preventing additional traffic in the website environment, in order to increase speed, accuracy and security in network and web data sharing systems.
Keywords: Secure Socket Layer, Security of networks.
6967 Decision Trees for Predicting Risk of Mortality using Routinely Collected Data
Authors: Tessy Badriyah, Jim S. Briggs, Dave R. Prytherch
Abstract:
It is well known that logistic regression is the gold-standard method for predicting clinical outcomes, especially the risk of mortality. In this paper, the decision tree method is proposed to solve problems that commonly use logistic regression as a solution. The Biochemistry and Haematology Outcome Model (BHOM) dataset, obtained from Portsmouth NHS Hospital for the period 1 January to 31 December 2001, was divided into four subsets. One subset of training data was used to generate a model, and the model obtained was then applied to three testing datasets. The performance of each model from both methods was compared using calibration (the χ² test) and discrimination (area under the ROC curve, or c-index). The experiments showed that both methods give reasonable results in terms of the c-index; however, in some cases the calibration value (χ²) was quite high. After conducting the experiments and investigating the advantages and disadvantages of each method, we conclude that decision trees can be seen as a worthy alternative to logistic regression in the area of data mining.
Keywords: Decision Trees, Logistic Regression, clinical outcome, risk of mortality.
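As a rough illustration of the comparison described above, the sketch below fits both methods on a synthetic, imbalanced binary-outcome dataset and reports the c-index (area under the ROC curve) plus a simplified chi-square calibration statistic over risk deciles. The data, tree depth and binning are assumptions for the example only; the BHOM dataset itself is not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=2000, n_features=8, weights=[0.9], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

models = {
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=1),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    # Discrimination: c-index (area under the ROC curve).
    c_index = roc_auc_score(y_te, p)
    # Calibration: simplified chi-square over risk deciles (observed vs expected events).
    order = np.argsort(p)
    chi2 = 0.0
    for decile in np.array_split(order, 10):
        obs, exp = y_te[decile].sum(), p[decile].sum()
        chi2 += (obs - exp) ** 2 / max(exp, 1e-9)
    print(f"{name}: c-index = {c_index:.3f}, calibration chi2 = {chi2:.1f}")
```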
6966 System Identification Based on Stepwise Regression for Dynamic Market Representation
Authors: Alexander Efremov
Abstract:
A system for market identification (SMI) is presented. The resulting representations are multivariable dynamic demand models. The market specifics are analyzed, and appropriate models and identification techniques are chosen. Multivariate static and dynamic models are used to represent the market behavior. The steps of the first stage of SMI, namely data preprocessing, are outlined. The second stage, model estimation, is then considered in more detail. Stepwise linear regression (SWR) is used to determine the significant cross-effects and the orders of the model polynomials. The estimates of the model parameters are obtained by a numerically stable estimator. Real market data are used to analyze SMI performance. The main conclusion concerns the applicability of multivariate dynamic models for the representation of market systems.
Keywords: market identification, dynamic models, stepwise regression.
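The forward-selection step at the heart of SWR can be sketched in a few lines. The example below, written with statsmodels, builds a toy demand dataset with one lagged (dynamic) regressor and adds candidate variables one at a time while their p-values stay below a threshold; the variables, threshold and data are illustrative and not part of the SMI system.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
data = pd.DataFrame({
    "price": rng.normal(10, 1, n),
    "promo": rng.integers(0, 2, n),
    "competitor_price": rng.normal(10, 1, n),
})
data["demand"] = 50 - 3 * data["price"] + 8 * data["promo"] + rng.normal(0, 2, n)
data["demand_lag1"] = data["demand"].shift(1)          # dynamic (lagged) term
data = data.dropna()

def forward_stepwise(df, target, candidates, alpha=0.05):
    """Add the most significant remaining regressor until none passes alpha."""
    selected = []
    while True:
        pvals = {}
        for c in candidates:
            if c in selected:
                continue
            X = sm.add_constant(df[selected + [c]])
            pvals[c] = sm.OLS(df[target], X).fit().pvalues[c]
        best = min(pvals, key=pvals.get) if pvals else None
        if best is None or pvals[best] > alpha:
            return selected
        selected.append(best)

chosen = forward_stepwise(data, "demand",
                          ["price", "promo", "competitor_price", "demand_lag1"])
print("significant regressors:", chosen)
```

In the SMI context the candidate set would contain lagged terms and cross-effects of competing products, and the surviving regressors fix the orders of the model polynomials.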
6965 Fuzzy Trust for Peer-to-Peer Based Systems
Authors: Farag Azzedin, Ahmad Ridha, Ali Rizvi
Abstract:
Trust management is one of the main challenges in Peer-to-Peer (P2P) systems. The lack of centralized control makes it difficult to control the behavior of the peers. A reputation system is one approach to providing trust assessment in P2P systems. In this paper, we use fuzzy logic to model trust in a P2P environment. Our trust model combines first-hand (direct experience) and second-hand (reputation) information to allow peers to represent and reason with uncertainty regarding other peers' trustworthiness. Fuzzy logic can help in handling the imprecise nature and uncertainty of trust. Linguistic labels are used to enable peers to assign a trust level intuitively. Our fuzzy trust model is flexible in that inference rules are used to weight first-hand and second-hand information accordingly.
Keywords: P2P Systems, Trust, Reputation, Fuzzy Logic.
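A minimal, self-contained sketch of the idea is given below: triangular membership functions map a numeric trust value in [0, 1] to linguistic labels, and first-hand experience is blended with reputation using a simple weighting that stands in for the paper's inference rules. All membership breakpoints and weights are illustrative assumptions, not the authors' parameters.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

LABELS = {  # linguistic trust labels as fuzzy sets over [0, 1]
    "low":    lambda x: tri(x, -0.01, 0.0, 0.5),
    "medium": lambda x: tri(x, 0.0, 0.5, 1.0),
    "high":   lambda x: tri(x, 0.5, 1.0, 1.01),
}

def combine_trust(direct, reputation, n_interactions):
    # Weight first-hand evidence more heavily as direct experience accumulates
    # (a simple stand-in for the paper's inference rules).
    w = min(1.0, n_interactions / 10.0)
    return w * direct + (1.0 - w) * reputation

def fuzzify(trust):
    """Return the degree of membership of a trust value in each linguistic label."""
    return {label: mu(trust) for label, mu in LABELS.items()}

t = combine_trust(direct=0.8, reputation=0.4, n_interactions=3)
print(t, fuzzify(t))   # e.g. a peer judged mostly "medium", partly "high"
```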
6964 Structural Parsing of Natural Language Text in Tamil Using Phrase Structure Hybrid Language Model
Authors: Selvam M, Natarajan. A M, Thangarajan R
Abstract:
Parsing is important in linguistics and natural language processing for understanding the syntax and semantics of a natural language grammar. Parsing natural language text is challenging because of problems like ambiguity and inefficiency, and the interpretation of natural language text depends on context-based techniques. A probabilistic component is essential to resolve ambiguity in both syntax and semantics, thereby increasing the accuracy and efficiency of the parser. The Tamil language has some inherent features which are more challenging. To obtain solutions, a lexicalized and statistical approach is applied to parsing with the aid of a language model. Statistical models mainly focus on the semantics of the language and are suitable for large vocabulary tasks, whereas structural methods focus on syntax and model small vocabulary tasks. A statistical language model based on trigrams for the Tamil language, with a medium vocabulary of 5000 words, has been built. Though statistical parsing gives better performance through trigram probabilities and large vocabulary size, it has some disadvantages, such as a focus on semantics rather than syntax and a lack of support for the free ordering of words and long-term relationships. To overcome these disadvantages, a structural component is incorporated into statistical language models, which leads to the implementation of hybrid language models. This paper attempts to build a phrase structure hybrid language model which resolves the above-mentioned disadvantages. In the development of the hybrid language model, a new part-of-speech tag set for the Tamil language has been developed, with more than 500 tags giving wide coverage. A phrase structure treebank has been developed with 326 Tamil sentences covering more than 5000 words. A hybrid language model has been trained with the phrase structure treebank using the immediate head parsing technique. A lexicalized and statistical parser which employs this hybrid language model and the immediate head parsing technique gives better results than pure grammar and trigram-based models.
Keywords: Hybrid Language Model, Immediate Head Parsing, Lexicalized and Statistical Parsing, Natural Language Processing, Parts of Speech, Probabilistic Context Free Grammar, Tamil Language, Tree Bank.
6963 Torrefaction of Biomass Pellets: Modeling of the Process in a Fixed Bed Reactor
Authors: Ekaterina Artiukhina, Panagiotis Grammelis
Abstract:
Torrefaction of biomass pellets is considered a useful pretreatment technology for converting them into a high-quality solid biofuel that is more suitable for pyrolysis, gasification, combustion, and co-firing applications. In the course of torrefaction, the temperature varies across the pellet, and therefore chemical reactions proceed unevenly within it; nevertheless, a uniform thermal distribution along the pellet is generally assumed. The torrefaction process of a single cylindrical pellet is modeled here, accounting for heat transfer coupled with chemical kinetics. A drying sub-model was also introduced. The non-stationary process of wood pellet decomposition is described by a system of non-linear partial differential equations for temperature and mass. The model captures well the main features of the experimental data.
Keywords: Torrefaction, biomass pellets, model, heat and mass transfer.
6962 A Convolutional Deep Neural Network Approach for Skin Cancer Detection Using Skin Lesion Images
Authors: Firas Gerges, Frank Y. Shih
Abstract:
Malignant melanoma, known simply as melanoma, is a type of skin cancer that appears as a mole on the skin. It is critical to detect this cancer at an early stage because it can spread across the body and may lead to the patient's death. When detected early, melanoma is curable. In this paper we propose a deep learning model (a convolutional neural network) to automatically classify skin lesion images as malignant or benign. Images underwent certain pre-processing steps to diminish the effect of the normal skin region on the model. The proposed model showed a significant improvement over previous work, achieving an accuracy of 97%.
Keywords: Deep learning, skin cancer, image processing, melanoma.
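The sketch below shows a small Keras convolutional network for the binary malignant/benign task described above. The input size, layer widths and dropout rate are assumptions for illustration; the paper's exact architecture and pre-processing are not reproduced here.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(input_shape=(128, 128, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Rescaling(1.0 / 255),               # normalize pixel values
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),     # malignant vs. benign
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

model = build_model()
model.summary()
# Training would use e.g. model.fit(train_ds, validation_data=val_ds, epochs=20)
# with a tf.data pipeline of pre-processed lesion images.
```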
6961 Species Spreading due to Environmental Hostility, Dispersal Adaptation and Allee Effects
Authors: Sanjeeva Balasuriya
Abstract:
A phenomenological model for species spreading which incorporates the Allee effect, a species' maximum attainable growth rate, collective dispersal rate and dispersal adaptability is presented. This builds on a well-established reaction-diffusion model for spatial spreading of invading organisms. The model is phrased in terms of the "hostility" (which quantifies the Allee threshold in relation to environmental sustainability) and dispersal adaptability (which measures how a species is able to adapt its migratory response to environmental conditions). The species' invading/retreating speed and the sharpness of the invading boundary are explicitly characterised in terms of the fundamental parameters, and analysed in detail.
Keywords: Allee effect, dispersal, migration speed, diffusion, invasion.
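For orientation, the well-established reaction-diffusion setting the abstract builds on can be written in the generic form below; the paper's phenomenological model recasts it through the hostility and dispersal-adaptability parameters, so this is only the underlying class of equation, not the authors' exact formulation.

```latex
\[
  \frac{\partial u}{\partial t}
  = D\,\frac{\partial^{2} u}{\partial x^{2}}
  + r\,u\left(1-\frac{u}{K}\right)\left(\frac{u}{A}-1\right),
  \qquad 0 < A < K,
\]
```

Here u(x, t) is the population density, D the collective dispersal rate, r the maximum attainable growth rate, K the carrying capacity and A the Allee threshold; travelling-wave solutions u(x, t) = U(x - ct) then give the invading (or retreating) speed c and the sharpness of the front.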
6960 Performance Determinants for Convenience Store Suppliers
Authors: Zainah Abdullah, Aznur Hajar Abdullah
Abstract:
This paper examines the impact of information and communication technology (ICT) usage, internal relationships, supplier-retailer relationships, logistics services and inventory management on convenience store suppliers' performance. Data were collected from 275 convenience store managers in Malaysia using a questionnaire. The multiple linear regression results indicate that inventory management, the supplier-retailer relationship, logistics services and the internal relationship are predictors of supplier performance as perceived by convenience store managers, whereas ICT usage is not. The study focuses only on convenience stores and petrol station convenience stores and concentrates only on managers. The results provide insights to suppliers who serve convenience stores, and possibly similar retail formats, on factors to consider in improving their service to retailers. The results also provide insights to the government, in its aspiration to improve the business operations of convenience stores, on ways to enhance the adoption of ICT by retailers and suppliers.
Keywords: Information and communication technology (ICT), internal relationship, inventory management, logistics services, supplier performance, supplier-retailer relationship.
6959 Modeling of a Novel Dual-Belt Continuously Variable Transmission for Automobiles
Authors: Y. Q. Chen, P. K. Wong, Z. C. Xie, H. W. Wu, K. U. Chan, J. L. Huang
Abstract:
It is believed that the continuously variable transmission (CVT) will dominate automotive transmissions in the future. The most popular design is Van Doorne's CVT with a single metal pushing V-belt. However, it is only applicable to low-power passenger cars because its major limitation is low torque capacity. Therefore, this research studies a novel dual-belt CVT system to overcome the limitation of the traditional single-belt CVT, so that it can be applied to heavy-duty vehicles. This paper presents the mathematical model of the design and its experimental verification. Experimental and simulated results show that the developed model is valid and that the proposed dual-belt CVT can indeed overcome the traditional limitation of the single-belt Van Doorne's CVT.
Keywords: Analytical model, CVT, Dual belts, Torque capacity.
6958 Using the Technology Acceptance Model to Examine Seniors' Attitudes toward Facebook
Authors: Chien-Jen Liu, Shu Ching Yang
Abstract:
Using the technology acceptance model (TAM), this study examined the external variables of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, evincing a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures that are proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.
Keywords: Technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness.
6957 Intelligent Modeling of the Electrical Activity of the Human Heart
Authors: Lambros V. Skarlas, Grigorios N. Beligiannis, Efstratios F. Georgopoulos, Adam V. Adamopoulos
Abstract:
The aim of this contribution is to present a new approach to modeling the electrical activity of the human heart. A recurrent artificial neural network is used in order to exhibit a subset of the dynamics of the electrical behavior of the human heart. The proposed model can also be used, when integrated, as a diagnostic tool for the human heart system. What makes this approach unique is the fact that every model is developed from physiological measurements of an individual. This kind of approach is very difficult to apply successfully in many modeling problems because of the complexity and entropy of the free variables describing the complex system. Differences between the modeled variables and the variables of an individual, measured at specific moments, can be used for diagnostic purposes. The sensor fusion used in order to optimize the utilization of biomedical sensors is another point that this paper focuses on. Sensor fusion has been known for its advantages in applications such as control and diagnostics of mechanical and chemical processes.
Keywords: Artificial Neural Networks, Diagnostic System, Health Condition Modeling Tool, Heart Diagnostics Model, Heart Electricity Model.
6956 Edge Segmentation of Satellite Image using Phase Congruency Model
Authors: Ahmed Zaafouri, Mounir Sayadi, Farhat Fnaiech
Abstract:
In this paper, we present a method for edge segmentation of satellite images based on the 2-D Phase Congruency (PC) model. The proposed approach is composed of two steps: the contextual non-linear smoothing algorithm (CNLS) is used to smooth the input images; then, a 2-D stretched Gabor filter (S-G filter) based on a proposed angular variation is developed in order to avoid the multiple responses of previous work. An assessment of the proposed method's performance is provided in terms of the accuracy of satellite image edge segmentation. The proposed method is compared with other known approaches.
Keywords: Edge segmentation, Phase congruency model, Satellite images, Stretched Gabor filter.
6955 Simulation of a Multi-Component Transport Model for the Chemical Reaction of a CVD-Process
Abstract:
In this paper we present discretization and decomposition methods for a multi-component transport model of a chemical vapor deposition (CVD) process. CVD processes are used to manufacture deposition layers or bulk materials. In our transport model we simulate the deposition of thin layers. The microscopic model is based on the heavy particles, which are derived by approximately solving a linearized multi-component Boltzmann equation. For the drift process of the particles we propose diffusion-reaction equations, as well as for the effects of heat conduction. We concentrate on solving the diffusion-reaction equation with analytical and numerical methods. For the chemical processes, modelled with reaction equations, we propose decomposition methods and decouple the multi-component models into simpler systems of differential equations. In the numerical experiments we present the computational results of our proposed models.
Keywords: Chemical reactions, chemical vapor deposition, convection-diffusion-reaction equations, decomposition methods, multi-component transport.
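As a sketch of the kind of system and splitting described above (with generic notation, not taken from the paper), each species concentration satisfies a convection-diffusion-reaction equation, and a first-order operator splitting decouples transport from the stiff chemistry over each time step:

```latex
\[
  \partial_t c_i = \nabla\!\cdot\!\bigl(D_i\,\nabla c_i\bigr) - \mathbf{v}\cdot\nabla c_i + R_i(c_1,\dots,c_n),
  \qquad i = 1,\dots,n,
\]
\[
  \frac{c_i^{\ast}-c_i^{k}}{\tau} = \nabla\!\cdot\!\bigl(D_i\,\nabla c_i^{\ast}\bigr) - \mathbf{v}\cdot\nabla c_i^{\ast},
  \qquad
  \frac{c_i^{k+1}-c_i^{\ast}}{\tau} = R_i\bigl(c^{k+1}\bigr),
\]
```

Here c_i is the concentration of species i, D_i its diffusion coefficient, v the drift velocity and R_i the chemical source term; the transport sub-step can be handled by a linear solver while the reaction sub-step is treated with a stiff ODE integrator, which mirrors the decoupling idea the abstract describes.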
6954 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the cost of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: Anomaly detection, autoencoder, data centers, deep learning.
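A condensed sketch of the per-sensor reconstruction idea is shown below: one LSTM autoencoder per sensor is trained on normal windows, and simple statistics of the reconstruction error feed a random forest. Window length, layer sizes, the error features and the synthetic data are assumptions for illustration, not the paper's configuration, and the correlation-based sensor selection is omitted.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.ensemble import RandomForestClassifier

def build_lstm_autoencoder(timesteps, n_features=1):
    """One autoencoder per sensor: encode a window, then reconstruct it."""
    model = keras.Sequential([
        layers.Input(shape=(timesteps, n_features)),
        layers.LSTM(64),                        # encoder
        layers.RepeatVector(timesteps),         # expand latent vector back to a sequence
        layers.LSTM(64, return_sequences=True),
        layers.TimeDistributed(layers.Dense(n_features)),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

T = 60                                          # window length (assumed)
rng = np.random.default_rng(0)
normal = rng.normal(22.0, 0.3, size=(500, T, 1))      # e.g. temperature windows
ae = build_lstm_autoencoder(T)
ae.fit(normal, normal, epochs=5, batch_size=32, verbose=0)

def error_features(windows):
    """Mean/max/std of the absolute reconstruction error per window."""
    resid = np.abs(windows - ae.predict(windows, verbose=0))
    return np.stack([resid.mean(axis=(1, 2)), resid.max(axis=(1, 2)),
                     resid.std(axis=(1, 2))], axis=1)

# Labeled windows (normal + synthetic anomalies) to train the classifier stage.
anomalous = normal[:100] + rng.normal(0, 2.0, size=(100, T, 1))
X = np.vstack([error_features(normal[:100]), error_features(anomalous)])
y = np.array([0] * 100 + [1] * 100)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In the paper's setting, the residual features from all sensors' autoencoders would be combined before classification, and the labels would come from the data center's incident history rather than synthetic noise.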
6953 Jeffrey's Prior for Unknown Sinusoidal Noise Model via Cramer-Rao Lower Bound
Authors: Samuel A. Phillips, Emmanuel A. Ayanlowo, Rasaki O. Olanrewaju, Olayode Fatoki
Abstract:
This paper employs the Jeffrey's prior technique for estimating the periodogram and frequency of a sinusoidal model for unknown noisy time-varying or oscillating events (data) in a Bayesian setting. The non-informative Jeffrey's prior was adopted for the posterior trigonometric function of the sinusoidal model, and Cramer-Rao Lower Bound (CRLB) inference was used to carve out the minimum variance needed to curb the invariance-structure effect for unknown noisy time observations and repeated circular patterns. An average monthly oscillating temperature series, measured in degrees Celsius (°C) from 1901 to 2014, was subjected to the posterior solution of the unknown noisy events of the sinusoidal model via Markov Chain Monte Carlo (MCMC). It was deduced not only that a two-minute period is required to complete a cycle of temperature change from one particular degree Celsius to another, but also that the sinusoidal model with the CRLB-Jeffrey's prior for unknown noisy events produced a smaller posterior Maximum A Posteriori (MAP) estimate compared to that obtained for known noisy events.
Keywords: Cramer-Rao Lower Bound (CRLB), Jeffrey's prior, Sinusoidal, Maximum A Posteriori (MAP), Markov Chain Monte Carlo (MCMC), Periodograms.
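For context, the usual single-frequency form of such a sinusoidal model and the associated non-informative prior can be written as below; the authors' parameterization and their CRLB-based variance constraint may differ, so this is only an orientation sketch.

```latex
\[
  y_t = A\cos(\omega t) + B\sin(\omega t) + \varepsilon_t,
  \qquad \varepsilon_t \sim \mathcal{N}(0,\sigma^{2}),
  \qquad
  p(A, B, \omega, \sigma) \propto \frac{1}{\sigma}.
\]
```

Under this setup the posterior for the frequency concentrates near the peak of the periodogram, and the Cramer-Rao lower bound gives the smallest variance any unbiased estimator of (A, B, ω) can attain, which is the kind of minimum-variance bound the abstract refers to before MCMC sampling.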
6952 Analysis of the Communication Methods of an iCIM 3000 System within the Frame of Research Purpose
Authors: Radovan Holubek, Daynier Rolando Delgado Sobrino, Roman Ruzarovsky
Abstract:
Current trends in manufacturing are characterized by a broadening of production, shortening of innovation cycles, and products with new shapes, materials and functions. The time-oriented production strategy requires a change from the traditional functional production structure to flexible manufacturing cells and lines. Production by automated manufacturing systems (AMS) has been one of the most important manufacturing philosophies in recent years. The main goal of the project we are involved in lies in building a laboratory housing a flexible manufacturing system consisting of at least two production machines with NC control (milling machines, lathe). These machines will be linked to a transport system and served by industrial robots. Within this flexible manufacturing system, a quality control station consisting of a camera system and a rack warehouse will also be located. The design, analysis and improvement of this manufacturing system, with a special focus on the communication among devices, constitute the main aims of this paper. The key determining factors for the manufacturing system design are: the product, the production volume, the machines used, the available manpower, the available infrastructure and the legislative frame for the specific cases.
Keywords: Paperless manufacturing, flexible manufacturing, robotized manufacturing, material flow, iCIM.
6951 Bifurcation Analysis of a Delayed Predator-prey Fishery Model with Prey Reserve in Frequency Domain
Authors: Changjin Xu
Abstract:
In this paper, applying a frequency domain approach, a delayed predator-prey fishery model with prey reserve is investigated. By choosing the delay τ as a bifurcation parameter, it is found that Hopf bifurcation occurs as the bifurcation parameter τ passes a sequence of critical values; that is, a family of periodic solutions bifurcates from the equilibrium when the bifurcation parameter exceeds a critical value. The length of delay which preserves the stability of the positive equilibrium is calculated. Some numerical simulations are included to justify the theoretical analysis results. Finally, the main conclusions are given.
Keywords: Predator-prey model, stability, Hopf bifurcation, frequency domain, Nyquist criterion.
6950 Aeroelastic Response for Pure Plunging Motion of a Typical Section Due to Sharp Edged Gust, Using Jones Approximation Aerodynamics
Authors: M. H. Kargarnovin, A. Mamandi
Abstract:
This paper investigates the effects of a sharp-edged gust on the aeroelastic behavior and time-domain response of a typical section model undergoing pure plunging motion, using Jones approximate aerodynamics. Flutter analysis has been carried out with the p and p-k methods developed for the presented finite-state aerodynamic model of a typical section (airfoil). The gust analysis is introduced as a linear set of ordinary differential equations in a simplified procedure, by transformation into an eigenvalue problem.
Keywords: Aeroelastic response, jones approximation, pure plunging motion, sharp edged gust.
6949 Image Transmission: A Case Study on Combined Scheme of LDPC-STBC in Asynchronous Cooperative MIMO Systems
Authors: Shan Ding, Lijia Zhang, Hongming Xu
Abstract:
This paper presents a novel scheme which is capable of reducing the error rate and improving the transmission performance of asynchronous cooperative MIMO systems. A case study of image transmission is used to demonstrate the efficiency of the scheme. The linear dispersion structure is employed to accommodate the cooperative wireless communication network in a dynamic topology, as well as to achieve higher throughput than conventional space-time codes based on orthogonal designs. An LDPC encoder without girth-4 cycles and an STBC encoder with guard intervals are introduced. The experimental results show that the combined LDPC-STBC coder with guard intervals provides good error correction and BER performance in asynchronous cooperative communication. In the image transmission case study, the results show that the image quality obtained with the combined scheme is much better than that obtained without it in asynchronous cooperative MIMO systems.
Keywords: Cooperative MIMO, image transmission, linear dispersion codes, Low-Density Parity-Check (LDPC).
6948 Machine Scoring Model Using Data Mining Techniques
Authors: Wimalin S. Laosiritaworn, Pongsak Holimchayachotikul
Abstract:
This article proposes a methodology for computer numerical control (CNC) machine scoring. The case study company is a manufacturer of hard disk drive parts in Thailand. In this company, samples of parts manufactured on CNC machines are usually taken randomly for quality inspection. These inspection data were used to decide whether to shut down a machine if it has a tendency to produce parts that are out of specification. A large amount of data is produced in this process, and data mining can be a very useful technique for analyzing it. In this research, data mining techniques were used to construct a machine scoring model called the 'machine priority assessment model' (MPAM). This model helps to ensure that machines with a higher risk of producing defective parts are inspected before those with a lower risk. If a defect-prone machine is identified sooner, defective parts and rework can be reduced, hence improving overall productivity. The results showed that the proposed method can be successfully implemented, and approximately 351,000 baht of opportunity cost could have been saved in the case study company.
Keywords: Computer Numerical Control, Data Mining, Hard Disk Drive.
6947 Using Jumping Particle Swarm Optimization for Optimal Operation of Pump in Water Distribution Networks
Authors: R. Rajabpour, N. Talebbeydokhti, M. H. Ahmadi
Abstract:
Carefully scheduling the operation of pumps can result in significant energy savings. Schedules can be defined either implicitly, in terms of other elements of the network such as tank levels, or explicitly, by specifying the time during which each pump is on or off. In this study, two new explicit representations based on time-controlled triggers were analyzed, where the maximum number of pump switches is established beforehand and the schedule may contain fewer switches than the maximum. The optimal operation of pumping stations was determined using a Jumping Particle Swarm Optimization (JPSO) algorithm to achieve the minimum energy cost. The model integrates the JPSO optimizer and the EPANET hydraulic network solver. The optimal pump operation schedule of the VanZyl water distribution system was determined using the proposed model and compared with those from genetic and ant colony algorithms. The results indicate that the proposed model utilizing the JPSO algorithm is a versatile management model for the operation of real-world water distribution systems.
Keywords: JPSO, operation, optimization, water distribution system.
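To make the explicit, time-based representation concrete, the sketch below searches for a 24-hour on/off schedule with a jumping-PSO-style move, where each hour of a particle either copies a personal/global attractor or flips at random. The tariff, pump power, demand and tank-balance penalty are a toy stand-in for the EPANET hydraulic solver used in the paper, and all parameter values are illustrative.

```python
import random

HOURS = 24
TARIFF = [0.05 if 0 <= h < 7 else 0.12 for h in range(HOURS)]  # $/kWh, toy tariff
PUMP_KW, PUMP_FLOW, DEMAND = 40.0, 120.0, 70.0                 # toy hydraulics

def cost(schedule):
    energy = sum(TARIFF[h] * PUMP_KW for h in range(HOURS) if schedule[h])
    # Crude mass balance: the tank must never run dry (stand-in for EPANET checks).
    level, penalty = 200.0, 0.0
    for h in range(HOURS):
        level += (PUMP_FLOW if schedule[h] else 0.0) - DEMAND
        if level < 0:
            penalty += -level * 10.0
    return energy + penalty

def jump(schedule, attractor, p_follow=0.6, p_random=0.1):
    # "Jumping" move: each hour copies the attractor with some probability,
    # is flipped at random with a small probability, or is kept.
    new = list(schedule)
    for h in range(HOURS):
        r = random.random()
        if r < p_follow:
            new[h] = attractor[h]
        elif r < p_follow + p_random:
            new[h] = 1 - new[h]
    return new

def jpso(n_particles=30, iters=300):
    swarm = [[random.randint(0, 1) for _ in range(HOURS)] for _ in range(n_particles)]
    pbest = [list(s) for s in swarm]
    gbest = min(swarm, key=cost)
    for _ in range(iters):
        for i, s in enumerate(swarm):
            # Jump toward the personal best or the global best, chosen at random.
            attractor = pbest[i] if random.random() < 0.5 else gbest
            swarm[i] = jump(s, attractor)
            if cost(swarm[i]) < cost(pbest[i]):
                pbest[i] = list(swarm[i])
        gbest = min(pbest + [gbest], key=cost)
    return gbest, cost(gbest)

best, best_cost = jpso()
print("best schedule:", best, "cost:", round(best_cost, 2))
```

A maximum-switch limit, in the spirit of the paper's time-controlled triggers, could be enforced by adding a penalty on the number of on/off transitions inside `cost`.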
6946 A Multi-Objective Model for Supply Chain Network Design under Stochastic Demand
Authors: F. Alborzi, H. Vafaei, M.H. Gholami, M.M. S. Esfahani
Abstract:
In this article, the design of a Supply Chain Network (SCN) consisting of several suppliers, production plants, distribution centers and retailers is considered. The demands of retailers are treated as stochastic parameters, so amounts of data are generated via simulation to extract a few demand scenarios. A mixed-integer two-stage programming model is then developed to optimize two objectives simultaneously: (1) minimization of the fixed and variable costs, and (2) maximization of the service level. A weighting method is utilized to solve this two-objective problem, and a numerical example is given to show the performance of the model.
Keywords: Mixed Integer Programming, Multi-objective Optimization, Stochastic Demand, Supply Chain Design, Two Stage Programming.
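A toy version of the weighted-sum, two-stage idea can be written with PuLP as below. The network (two distribution centers, two retailers, two demand scenarios), all cost and capacity figures, and the service-level expression are illustrative assumptions, not the model or data from the article.

```python
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

DCS, RETAILERS, SCENARIOS = ["dc1", "dc2"], ["r1", "r2"], ["low", "high"]
prob_s = {"low": 0.5, "high": 0.5}
demand = {("r1", "low"): 80, ("r2", "low"): 60, ("r1", "high"): 120, ("r2", "high"): 100}
open_cost = {"dc1": 500, "dc2": 400}
ship_cost = {("dc1", "r1"): 2, ("dc1", "r2"): 4, ("dc2", "r1"): 3, ("dc2", "r2"): 2}
capacity = {"dc1": 150, "dc2": 120}
w_cost, w_service = 1.0, 50.0   # weighting-method trade-off (illustrative)

m = LpProblem("supply_chain_design", LpMinimize)
y = LpVariable.dicts("open", DCS, cat=LpBinary)          # first stage: open a DC or not
x = LpVariable.dicts("flow", [(d, r, s) for d in DCS for r in RETAILERS for s in SCENARIOS],
                     lowBound=0)                          # second stage: shipments per scenario

exp_cost = lpSum(open_cost[d] * y[d] for d in DCS) + lpSum(
    prob_s[s] * ship_cost[d, r] * x[d, r, s]
    for d in DCS for r in RETAILERS for s in SCENARIOS)
# Expected fraction of demand served, averaged over retailers.
exp_service = (1.0 / len(RETAILERS)) * lpSum(
    prob_s[s] * (1.0 / demand[r, s]) * x[d, r, s]
    for d in DCS for r in RETAILERS for s in SCENARIOS)

m += w_cost * exp_cost - w_service * exp_service          # weighted-sum objective

for s in SCENARIOS:
    for r in RETAILERS:
        m += lpSum(x[d, r, s] for d in DCS) <= demand[r, s]             # serve at most demand
    for d in DCS:
        m += lpSum(x[d, r, s] for r in RETAILERS) <= capacity[d] * y[d]  # ship only if open

m.solve()
print({d: y[d].value() for d in DCS}, "objective:", value(m.objective))
```

Sweeping the weights `w_cost` and `w_service` traces out different trade-offs between expected cost and expected service level, which is the role of the weighting method mentioned in the abstract.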
6945 Image Adaptive Watermarking with Visual Model in Orthogonal Polynomials based Transformation Domain
Authors: Krishnamoorthi R., Sheba Kezia Malarchelvi P. D.
Abstract:
In this paper, an image adaptive, invisible digital watermarking algorithm with Orthogonal Polynomials based Transformation (OPT) is proposed, for copyright protection of digital images. The proposed algorithm utilizes a visual model to determine the watermarking strength necessary to invisibly embed the watermark in the mid frequency AC coefficients of the cover image, chosen with a secret key. The visual model is designed to generate a Just Noticeable Distortion mask (JND) by analyzing the low level image characteristics such as textures, edges and luminance of the cover image in the orthogonal polynomials based transformation domain. Since the secret key is required for both embedding and extraction of watermark, it is not possible for an unauthorized user to extract the embedded watermark. The proposed scheme is robust to common image processing distortions like filtering, JPEG compression and additive noise. Experimental results show that the quality of OPT domain watermarked images is better than its DCT counterpart.
Keywords: Orthogonal Polynomials based Transformation, Digital Watermarking, Copyright Protection, Visual model.
6944 Anonymous Editing Prevention Technique Using Gradient Method for High-Quality Video
Authors: Jiwon Lee, Chanho Jung, Si-Hwan Jang, Kyung-Ill Kim, Sanghyun Joo, Wook-Ho Son
Abstract:
Since advances in digital imaging technologies have led to the development of high-quality digital devices, there are many illegal copies of copyrighted video content on the Internet, and unauthorized editing occurs frequently. Thus, we propose an editing prevention technique for high-quality (HQ) video that can prevent these illegally edited copies from spreading. The proposed technique applies spatial and temporal gradient methods to improve fidelity and detection performance. Also, the scheme duplicates the embedded signal temporally to alleviate the signal reduction caused by geometric and signal-processing distortions. Experimental results show that the proposed scheme achieves better performance than previously proposed schemes and has high fidelity. The proposed scheme can be used in unauthorized-access prevention for visual communication or in traitor tracing applications which need a fast detection process to prevent illegally edited video content from spreading.
Keywords: Editing prevention technique, gradient method, high-quality video, luminance change, visual communication.