Search results for: fuzzy intelligent technique.
2132 An Active Set Method in Image Inpainting
Authors: Marrick Neri, Esmeraldo Ronnie Rey Zara
Abstract:
In this paper, we apply a semismooth active set method to image inpainting. The method exploits primal and dual features of a proposed regularized total variation model, following the technique presented in [4]. Numerical results show that the method is fast and efficient in inpainting sufficiently thin domains.
Keywords: Active set method, image inpainting, total variation model.
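The abstract does not spell out the regularized total variation model; as a hedged illustration, a standard smoothed TV inpainting functional of the kind primal-dual active set methods are applied to reads:

```latex
\min_{u}\; \frac{1}{2}\int_{D \setminus \Omega} (u - f)^2 \, dx
\;+\; \alpha \int_{D} \sqrt{\,|\nabla u|^2 + \varepsilon^2\,}\, dx,
```

where D is the image domain, Ω the inpainting region, f the observed image, α the regularization weight, and ε > 0 a smoothing parameter that makes the problem amenable to a semismooth Newton/active set treatment. The paper's actual regularization may differ.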
2131 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia
Authors: Carol Anne Hargreaves
Abstract:
A key issue in stock investment is how to select representative features for stock selection. The first objective of this paper is to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock’s price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that was likely to go up in price – portfolio 1. Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two – portfolio 2. Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up – portfolio 3. The predictive models were validated with metrics such as sensitivity (recall), specificity and overall accuracy. All accuracy measures were above 70%, and all portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month, after which the return for each portfolio was computed and compared with the stock market index return. The one-month returns were 23.87% for the principal component analysis portfolio, 11.65% for the logistic regression portfolio and 8.88% for the K-means cluster portfolio, while the stock market returned 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top performing stock portfolios that outperform the stock market.
Keywords: Machine learning, stock market trading, logistic regression, principal component analysis, automated stock investment system.
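A minimal, self-contained sketch of the three portfolio-selection routes described above (k-means cluster, PCA scores, logistic regression probabilities) using scikit-learn on synthetic data; the feature matrix, labels and selection rules are illustrative assumptions, not the study's actual ones.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))   # 200 stocks x 6 technical features (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200)) > 0  # "price went up" flag
Xs = StandardScaler().fit_transform(X)

# Portfolio 1: k-means clustering; keep the cluster with the highest up-rate.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Xs)
best = max(range(4), key=lambda k: y[labels == k].mean())
portfolio1 = np.where(labels == best)[0]

# Portfolio 2: rank stocks on the first two principal component scores.
scores = PCA(n_components=2).fit_transform(Xs)
portfolio2 = np.argsort(-(scores[:, 0] + scores[:, 1]))[:10]

# Portfolio 3: logistic regression; keep stocks with high P(price up).
proba = LogisticRegression().fit(Xs, y).predict_proba(Xs)[:, 1]
portfolio3 = np.argsort(-proba)[:10]

print(portfolio1[:10], portfolio2, portfolio3)
```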
2130 Optical Reflectance of Pure and Doped Tin Oxide: From Thin Films to Poly-Crystalline Silicon/Thin Film Device
Authors: Smaali Assia, Outemzabet Ratiba, Media El Mahdi, Kadi Mohamed
Abstract:
Films of pure tin oxide (SnO2) and of antimony-doped tin oxide (SnO2-Sb) deposited onto glass substrates have shown an energy gap sufficiently high to be transparent in the visible region, a high electrical mobility and a carrier concentration which together yield good electrical conductivity [1]. In this work, the effects of a polycrystalline silicon substrate on the optical properties of pure and Sb-doped tin oxide are investigated. We used the APCVD (atmospheric pressure chemical vapour deposition) technique, a simple and low-cost method, under a nitrogen ambient to grow this material. A series of SnO2 and SnO2-Sb films was deposited onto polycrystalline silicon substrates with different antimony contents under the same deposition conditions (substrate temperature, oxygen flow, duration and nitrogen atmosphere of the reactor). The effect of the substrate in terms of morphology and nonlinear optical properties, mainly the reflectance, was studied. The reflectance intensity of the device, compared to that of tin oxide films deposited directly on a glass substrate, is clearly reduced over the whole wavelength range. The roughness of the polycrystalline silicon clearly plays an important role, modifying the reflectance and hence the optical parameters. A clear shift of the reflectance minimum with doping level is observed. This minimum corresponds to strong free carrier absorption and hence to a different plasma frequency. This effect is followed by an increase in the reflectance depending on the antimony doping. These effects are discussed by applying the extended Drude theory to the combined optical and electrical results.
Keywords: Doping, oxide, reflectance.
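As context for the plasma-frequency argument (a standard simple Drude picture; the paper's extended Drude analysis adds scattering-rate dispersion not shown here), the near-normal-incidence reflectance follows from the Drude dielectric function:

```latex
\varepsilon(\omega) = \varepsilon_\infty - \frac{\omega_p^2}{\omega^2 + i\,\omega/\tau},
\qquad
\omega_p^2 = \frac{N e^2}{\varepsilon_0 m^*},
\qquad
R(\omega) = \left| \frac{\sqrt{\varepsilon(\omega)} - 1}{\sqrt{\varepsilon(\omega)} + 1} \right|^2 .
```

Since ω_p grows with the free-carrier concentration N, increasing the Sb doping shifts the reflectance minimum (near the plasma edge) to higher frequency, consistent with the doping-dependent shift reported above.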
2129 A Nodal Transmission Pricing Model based on Newly Developed Expressions of Real and Reactive Power Marginal Prices in Competitive Electricity Markets
Authors: Ashish Saini, A.K. Saxena
Abstract:
In competitive electricity markets all over the world, the adoption of a suitable transmission pricing model remains a problem, as the transmission segment still operates as a monopoly. Transmission pricing is an important tool to promote investment for various transmission services in order to provide economic, secure and reliable electricity to bulk and retail customers. Nodal pricing based on SRMC (Short Run Marginal Cost) has been found extremely useful by researchers for sending correct economic signals. The marginal prices must be determined as part of the solution to an optimization problem, i.e. to maximize the social welfare. The need to maximize the social welfare subject to a number of system operational constraints is a major challenge from computational and societal points of view. The purpose of this paper is to present a nodal transmission pricing model based on SRMC by developing new mathematical expressions of real and reactive power marginal prices using a GA-Fuzzy based optimal power flow framework. The impacts of selecting different social welfare functions on power marginal prices are analyzed and verified against results reported in the literature. Network revenues for two different power systems are determined using the expressions derived for real and reactive power marginal prices in this paper.
Keywords: Deregulation, electricity markets, nodal pricing, social welfare function, short run marginal cost.
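The paper's newly developed closed-form expressions are not reproduced in the abstract; as a hedged reference point, SRMC-based nodal prices are conventionally the Lagrange multipliers of the nodal balance constraints in the welfare-maximizing optimal power flow:

```latex
\max_{P,\,Q}\;\; SW = \sum_{i} B_i(P_{d,i}) - \sum_{j} C_j(P_{g,j})
\quad \text{s.t.} \quad
P_{g,i} - P_{d,i} = P_i(V,\delta), \qquad
Q_{g,i} - Q_{d,i} = Q_i(V,\delta),
```

```latex
\rho_{p,i} = \frac{\partial \mathcal{L}}{\partial P_{d,i}}, \qquad
\rho_{q,i} = \frac{\partial \mathcal{L}}{\partial Q_{d,i}},
```

where B and C are benefit and cost functions, L is the Lagrangian, and ρ_p,i, ρ_q,i are the real and reactive power marginal prices at node i.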
2128 Computing SAGBI-Gröbner Basis of Ideals of Invariant Rings by Using Gaussian Elimination
Authors: Sajjad Rahmany, Abdolali Basiri
Abstract:
The link between Gröbner bases and linear algebra was described by Lazard [4,5], who realized that the Gröbner basis computation could be achieved by applying Gaussian elimination over Macaulay's matrix. In this paper, we indicate how the same technique may be applied to SAGBI-Gröbner basis computations in invariant rings.
Keywords: Gröbner basis, SAGBI-Gröbner basis, reduction, invariant ring, permutation groups.
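An assumption-level sketch (not the paper's algorithm) of the Lazard observation: encode polynomials as coefficient rows over an ordered monomial basis, then Gaussian elimination yields reduced generators whose leading terms are read off the pivot columns.

```python
from fractions import Fraction

def row_reduce(rows):
    """Exact Gaussian elimination; rows are lists of Fractions."""
    rows = [r[:] for r in rows]
    pivot_cols, lead = [], 0
    for col in range(len(rows[0])):
        pivot = next((i for i in range(lead, len(rows)) if rows[i][col] != 0), None)
        if pivot is None:
            continue
        rows[lead], rows[pivot] = rows[pivot], rows[lead]
        pv = rows[lead][col]
        rows[lead] = [x / pv for x in rows[lead]]            # normalize pivot row
        for i in range(len(rows)):
            if i != lead and rows[i][col] != 0:
                f = rows[i][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[lead])]
        pivot_cols.append(col)
        lead += 1
    return rows, pivot_cols

# Monomial basis ordered by a term order, e.g. [x^2, xy, y^2, x, y, 1];
# each row is one polynomial's coefficient vector on that basis.
F = Fraction
macaulay = [
    [F(1), F(2), F(1), F(0), F(0), F(0)],    # x^2 + 2xy + y^2
    [F(1), F(0), F(-1), F(0), F(0), F(0)],   # x^2 - y^2
]
reduced, pivots = row_reduce(macaulay)
print(pivots)   # pivot columns 0 and 1: leading monomials of the reduced set
```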
2127 Distributed Manufacturing (DM) - Smart Units and Collaborative Processes
Authors: Hermann Kuehnle
Abstract:
Applications of the Hausdorff space and its mappings into tangent spaces are outlined, including their fractal dimensions and self-similarities. The paper details this theoretical setup and further describes virtualizations and the atomization of manufacturing processes. It demonstrates novel concurrency principles that will guide manufacturing processes and resource configurations. Moreover, varying levels of detail may be produced by folding up and breaking down the newly introduced generic models. This choice of layered generic models for unit and system aspects allows research work in parallel to other disciplines with the same focus, on all levels of detail. More credit and easier access are granted to outside disciplines for enriching manufacturing grounds. The specific mappings and layers give hints at chances for interdisciplinary outcomes and may highlight more details for interoperability standards, as already being worked on at the international level. The new rules are described, which require additional properties of all involved entities for defining distributed decision cycles, again on the basis of self-similarity. All properties are further detailed and assigned to a maturity scale, eventually displaying the smartness maturity of a total shopfloor or a factory. The paper contributes to the intensive ongoing discussion in the field of intelligent distributed manufacturing and promotes solid concepts for implementations of Cyber Physical Systems and the Internet of Things in the manufacturing industry, like Industry 4.0, as discussed in German-speaking countries.
Keywords: Autonomous unit, Networkability, Smart manufacturing unit, Virtualization.
2126 Derivation of Monotone Likelihood Ratio Using Two Sided Uniformly Normal Distribution Techniques
Authors: D. A. Farinde
Abstract:
In this paper, two-sided uniformly normal distribution techniques were used in the derivation of the monotone likelihood ratio. The approach mainly employed the parameters of the distribution for the class of all size-α tests. The derivation technique is fast, direct and less burdensome when compared to some existing methods.
Keywords: Neyman-Pearson Lemma, normal distribution.
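For orientation (a standard fact, not the paper's two-sided derivation): the one-parameter normal family has a monotone likelihood ratio in x, which is what licenses uniformly most powerful one-sided tests via the Neyman-Pearson Lemma. For X ~ N(θ, σ²) with σ known and θ₂ > θ₁,

```latex
\Lambda(x) = \frac{f(x;\theta_2)}{f(x;\theta_1)}
= \exp\!\left( \frac{(\theta_2-\theta_1)\,x}{\sigma^2} + \frac{\theta_1^2-\theta_2^2}{2\sigma^2} \right),
```

which is increasing in x, so the family has MLR in x and a size-α test rejecting for large x is uniformly most powerful.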
2125 Layer-by-Layer Deposition of Poly(Ethylene Imine) Nanolayers on Polypropylene Nonwoven Fabric: Electrostatic and Thermal Properties
Authors: Dawid Stawski, Silviya Halacheva, Dorota Zielińska
Abstract:
The surface properties of many materials can be readily and predictably modified by the controlled deposition of thin layers containing appropriate functional groups, and this research area is now a subject of widespread interest. The layer-by-layer (lbl) method involves depositing oppositely charged layers of polyelectrolytes onto the substrate material, which are stabilized by strong electrostatic forces between adjacent layers. This type of modification affords products that combine the properties of the original material with the superficial parameters of the new external layers. Through an appropriate selection of the deposited layers, the surface properties can be precisely controlled and readily adjusted to meet the requirements of the intended application. In the presented paper, anionic (poly(acrylic acid)) and cationic (linear poly(ethylene imine)) polymers were successfully deposited onto a polypropylene nonwoven using the lbl technique. The chemical structure of the surface before and after modification was confirmed by reflectance FTIR spectroscopy, volumetric analysis and selective dyeing tests. As a direct result of this work, new materials with greatly improved properties have been produced. For example, following the modification process, significant changes in the electrostatic activity of a range of novel nanocomposite materials were observed. The deposition of polyelectrolyte nanolayers was found to strongly accelerate the loss of electrostatically generated charges and to increase considerably the thermal resistance of the modified fabric (the difference in T50% is over 20 °C). From our results, a clear relationship between the type of polyelectrolyte layer deposited onto the flat fabric surface and the properties of the modified fabric was identified.
Keywords: Layer-by-layer technique, polypropylene nonwoven, surface modification, surface properties.
2124 Comparison of Different Hydrograph Routing Techniques in XPSTORM Modelling Software: A Case Study
Authors: Fatema Akram, Mohammad Golam Rasul, Mohammad Masud Kamal Khan, Md. Sharif Imam Ibne Amir
Abstract:
A variety of routing techniques is available to develop surface runoff hydrographs from rainfall. The selection of the runoff routing method is vital, as it is directly related to the type of watershed and the required degree of accuracy. Different modelling software packages are available to explore the rainfall-runoff process in urban areas. XPSTORM, a link-node based, integrated stormwater modelling package, has been used in this study to develop the surface runoff hydrograph for a golf course area located in Rockhampton in Central Queensland, Australia. Four commonly used methods, namely SWMM runoff, Kinematic wave, Laurenson, and Time-Area, are employed to generate runoff hydrographs for the design storm of this study area. In the runoff mode of XPSTORM, rainfall, infiltration, evaporation and depression storage for the subcatchments were simulated, and the runoff from the subcatchment to the collection node was calculated. The simulation results are presented, discussed and compared. The total surface runoff generated by the SWMM runoff, Kinematic wave and Time-Area methods is found to be reasonably close, which indicates that any of these methods can be used for developing the runoff hydrograph of the study area. The Laurenson method produces comparatively less surface runoff; however, it creates the highest runoff peak of all the methods, which may make it suitable for hilly regions. Although the Laurenson hydrograph technique is a widely accepted surface runoff routing technique in Queensland (Australia), extensive investigation with detailed topographic and hydrologic data is recommended in order to assess its suitability for the case study area.
Keywords: ARI, design storm, IFD, rainfall temporal pattern, routing techniques, surface runoff, XPSTORM.
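A sketch of the Time-Area routing idea the study compares (illustrative numbers; XPSTORM's internal implementation is not reproduced here): the catchment is split into isochrone zones, and the hydrograph is the convolution of excess rainfall with the time-area histogram.

```python
import numpy as np

excess_rain = np.array([5.0, 12.0, 8.0, 2.0])           # mm per time step
area_zones = np.array([0.10, 0.25, 0.35, 0.20, 0.10])   # km^2 per isochrone zone
dt_hours = 0.25                                          # time step = 15 min

# Q[t] = sum_k i[t-k] * A[k]; convert mm*km^2 per step to m^3/s (factor 1/3.6 per hour).
q_m3s = np.convolve(excess_rain, area_zones) / (3.6 * dt_hours)
print(np.round(q_m3s, 2))   # ordinates of the runoff hydrograph
```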
2123 CBIR Using Multi-Resolution Transform for Brain Tumour Detection and Stages Identification
Authors: H. Benjamin Fredrick David, R. Balasubramanian, A. Anbarasa Pandian
Abstract:
Image retrieval is one of the most widely used techniques in today's digital world. CBIR, expanded as Content Based Image Retrieval, is an image processing technique that identifies relevant images and retrieves them based on patterns extracted from the digital images. In this paper, two research works are presented using CBIR. The first provides an automated and interactive approach to the analysis of CBIR techniques. CBIR works on the principle of supervised machine learning, which involves feature selection followed by training and testing phases applied to a classifier in order to perform prediction. For feature extraction, image transforms such as the Contourlet, Ridgelet and Shearlet are utilized to retrieve texture features from the images. The extracted features are used to train and build a classifier using classification algorithms such as Naïve Bayes, K-Nearest Neighbour and multi-class Support Vector Machine. The testing phase then predicts the class of a new input image using the trained classifier, labelling it as one of four classes: 1- normal brain, 2- benign tumour, 3- malignant tumour, or 4- severe tumour. The second research work develops a tool for tumour stage identification using the best feature extraction method and classifier identified in the first work. Finally, the tool is used to predict the tumour stage and provide suggestions based on the stage identified by the system. These two approaches are a contribution to the medical field, giving better retrieval performance and tumour stage identification.
Keywords: Brain tumour detection, content based image retrieval, classification of tumours, image retrieval.
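A sketch of the supervised stage on synthetic data: texture features feed a classifier that predicts the tumour class. Contourlet/ridgelet/shearlet transforms need dedicated libraries, so a wavelet-energy stand-in (PyWavelets) is used here purely as an assumed placeholder feature extractor.

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def texture_features(img):
    """Energy of each wavelet sub-band as a simple texture descriptor."""
    coeffs = pywt.wavedec2(img, "db2", level=2)
    bands = [coeffs[0]] + [b for lvl in coeffs[1:] for b in lvl]
    return np.array([np.mean(np.square(b)) for b in bands])

rng = np.random.default_rng(1)
# Synthetic stand-ins for MRI slices of the four classes.
images = rng.normal(size=(80, 64, 64)) + np.repeat(np.arange(4), 20)[:, None, None]
labels = np.repeat([0, 1, 2, 3], 20)   # normal, benign, malignant, severe

X = np.array([texture_features(im) for im in images])
clf = SVC(kernel="rbf", decision_function_shape="ovr")   # multi-class SVM
print(cross_val_score(clf, X, labels, cv=5).mean())
```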
2122 Recycling of Tungsten Alloy Swarf
Authors: A. A. Alhazza
Abstract:
The recycling of tungsten alloy swarf by an oxidation-reduction technique has been investigated. The reduced powder was pressed under a pressure of 20 kg/cm² and sintered at 1150°C in a dry hydrogen atmosphere. At a reduction temperature of 800°C, the particle size of the recycled alloy powder was 1-3 μm and the particle shape was regular. The chemical composition of the recycled alloy is the same as that of the primary swarf.
Keywords: Recycling, swarf, oxidation, reduction.
2121 Computational Feasibility Study of a Torsional Wave Transducer for Tissue Stiffness Monitoring
Authors: Rafael Muñoz, Juan Melchor, Alicia Valera, Laura Peralta, Guillermo Rus
Abstract:
A torsional piezoelectric ultrasonic transducer design is proposed to measure shear moduli in soft tissue with direct access availability, using the shear wave elastography technique. The measurement of shear moduli of tissues is a challenging problem, mainly because of a) the difficulty of isolating a pure shear wave, given the interference of multiple waves of different types (P, S, even guided) emitted by the transducers and reflected at geometric boundaries, and b) the highly attenuating nature of soft tissue. An immediate application, overcoming these drawbacks, is the measurement of changes in cervix stiffness to estimate the gestational age at delivery. The design has been optimized using a finite element model (FEM) and a semi-analytical estimator of the probability of detection (POD) to determine a suitable geometry, materials and generated waves. The technique is based on the measurement of the time of flight between emitter and receiver, from which the shear wave velocity is inferred. Current research is centered on prototype testing and validation. The geometric optimization of the transducer was able to annihilate the compressional wave emission, generating a quite pure torsional shear wave. Mechanical and electromagnetic coupling between the emitter and receiver signals is the current research focus. In conclusion, the design overcomes the main problems described: the almost pure torsional shear wave, together with the short time of flight, avoids the possibility of multiple wave interference, while the short propagation distance reduces the effect of attenuation and allows the emission of very low energies, assuring good biological safety for human use.
Keywords: Cervix ripening, preterm birth, shear modulus, shear wave elastography, soft tissue, torsional wave.
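The inference step is compact enough to state (standard elastography relations, with d the emitter-receiver distance and ρ the tissue density, roughly 1000 kg/m³ for soft tissue):

```latex
v_s = \frac{d}{t_{\text{flight}}}, \qquad \mu = \rho\, v_s^{2},
```

so the measured time of flight gives the shear wave velocity directly, and the shear modulus follows.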
2120 Optimal Placement of Processors based on Effective Communication Load
Authors: A. R. Aswatha, T. Basavaraju, N. Bhaskara Rao
Abstract:
This paper presents a new technique for the optimal placement of processors to minimize the total effective communication load in a multi-processor, communication-dominated environment. This is achieved by placing heavily loaded processors near each other and lightly loaded ones far away from one another in the physical grid locations. The results are mathematically proved and the algorithms are described.
Keywords: Ascending Sort Index Vector, Effective Communication Load, Effective Distance Matrix, Optimal Placement, Sorting Order.
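An assumption-level sketch of the placement heuristic described above: heavily loaded processors are placed near the grid centre (hence near each other) and lightly loaded ones toward the periphery. The cost below (load product times Manhattan distance) is an illustrative stand-in for the paper's effective communication load.

```python
import itertools
import numpy as np

loads = np.array([9, 1, 7, 3, 8, 2, 6, 4, 5])   # one load per processor
side = 3
cells = list(itertools.product(range(side), range(side)))
centre = ((side - 1) / 2, (side - 1) / 2)

# Grid cells sorted by closeness to the centre; processors sorted by load, descending.
cells_by_centrality = sorted(
    cells, key=lambda c: abs(c[0] - centre[0]) + abs(c[1] - centre[1]))
order = np.argsort(-loads)                       # ascending-sort index, reversed
placement = {proc: cell for proc, cell in zip(order, cells_by_centrality)}

def comm_load(placement, loads):
    """Sum of load_i * load_j * Manhattan distance over all processor pairs."""
    total = 0.0
    for a, b in itertools.combinations(placement, 2):
        (xa, ya), (xb, yb) = placement[a], placement[b]
        total += loads[a] * loads[b] * (abs(xa - xb) + abs(ya - yb))
    return total

print(comm_load(placement, loads))
```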
2119 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique
Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki
Abstract:
Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make it difficult to assess the inferior myocardium accurately. Our idea is to reduce the high accumulation in the liver by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient. From these data, ten SPECT images were reconstructed, one for every minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females). In each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation to a liver-only image is obtained by subtraction using the early projections, during which the liver accumulation dominates (0.5-2.5 minute SPECT image − 5-10 minute SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (5-10 minute SPECT image − liver-only image). The time subtraction of the liver was possible in both the phantom and the clinical study, and the visualization of the inferior myocardium was improved. In past reports, myocardial uptake overlapped by high liver accumulation was not diagnosable. Using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector.
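A minimal numpy sketch of the time-subtraction arithmetic described above, on synthetic frames (the scanner's 3-D iterative Bayesian reconstruction and the exact scaling are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(2)
myo = np.zeros((64, 64)); myo[20:40, 20:40] = 10.0      # toy myocardium
liver = np.zeros((64, 64)); liver[35:60, 5:30] = 40.0   # toy liver near the inferior wall

early = liver + 0.1 * myo + rng.poisson(1.0, myo.shape)  # 0.5-2.5 min: liver dominates
late = 0.6 * liver + myo + rng.poisson(1.0, myo.shape)   # 5-10 min: both structures

liver_only = np.clip(early - late, 0, None)       # approximate liver-only image
corrected = np.clip(late - liver_only, 0, None)   # myocardial image, liver reduced
print(corrected[25, 25], corrected[50, 10])       # myocardium retained, liver suppressed
```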
2118 Phosphine Mortality Estimation for Simulation of Controlling Pest of Stored Grain: Lesser Grain Borer (Rhyzopertha dominica)
Authors: Mingren Shi, Michael Renton
Abstract:
There is a world-wide need for the development of sustainable management strategies to control pest infestation and the development of phosphine (PH3) resistance in the lesser grain borer (Rhyzopertha dominica). Computer simulation models can provide a relatively fast, safe and inexpensive way to weigh the merits of various management options. However, the usefulness of simulation models relies on the accurate estimation of important model parameters, such as mortality. Concentration and time of exposure are both important in determining mortality in response to a toxic agent. Recent research indicated the existence of two resistance phenotypes in R. dominica in Australia, weak and strong, and revealed that the presence of resistance alleles at two loci confers strong resistance, thus motivating the construction of a two-locus model of resistance. Experimental data sets on purified pest strains, each corresponding to a single genotype of our two-locus model, were also available. Hence it became possible to explicitly include the mortalities of the different genotypes in the model. In this paper we describe how we used two generalized linear models (GLM), the probit and logistic models, to fit the available experimental data sets. We used a direct algebraic approach, the generalized inverse matrix technique, rather than the traditional maximum likelihood estimation, to estimate the model parameters. The results show that both the probit and logistic models fit the data sets well, but the former is much better in terms of smaller least squares (numerical) errors. Meanwhile, the generalized inverse matrix technique achieved accuracy similar to that of maximum likelihood estimation, but is less time consuming and computationally demanding.
Keywords: Mortality estimation, probit models, logistic model, generalized inverse matrix approach, pest control simulation.
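A sketch of the direct algebraic fit described above: transform observed mortalities with the probit link, then solve the linear model with a generalized (Moore-Penrose) inverse instead of iterative maximum likelihood. Doses and mortalities below are illustrative, not the paper's data.

```python
import numpy as np
from scipy.stats import norm

conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8])        # phosphine concentration (mg/L)
mortality = np.array([0.08, 0.25, 0.55, 0.85, 0.97])

X = np.column_stack([np.ones_like(conc), np.log10(conc)])   # intercept + log dose
y = norm.ppf(mortality)                 # probit transform of observed mortality
beta = np.linalg.pinv(X) @ y            # generalized inverse (least-squares) solution

predicted = norm.cdf(X @ beta)          # back-transform to fitted mortalities
print(beta, np.round(predicted, 3))
```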
2117 Utilizing Ontologies Using Ontology Editor for Creating Initial Unified Modeling Language (UML) Object Model
Authors: Waralak Vongdoiwang Siricharoen
Abstract:
One problem in object-oriented software development is the difficulty of finding appropriate and suitable objects with which to start the system. In this work, ontologies play a supporting role in object discovery at the start of object-oriented software development. Many studies have tried to demonstrate that there is great potential between object models and ontologies. Constructing an ontology from an object model, known as ontology engineering, can be done; this research aims to show that the reverse, building an object model from an ontology, is also promising and practical. Ontology classes are available online for many specific areas and can be found with semantic search engines. There are also many helping tools to do so; two of them used in this research are the Protégé ontology editor and Visual Paradigm. Putting them together gives a great outcome. This research shows how they work efficiently with a real case study using ontology classes in the travel/tourism domain. Classes, properties, and relationships from more than two ontologies need to be combined in order to generate the object model. This paper presents a simple methodology framework which explains the process of discovering objects. The results show that this framework has great value while remaining open to expansion. Reusing existing ontologies offers a much cheaper alternative than building new ones from scratch. More ontologies are becoming available on the web, and online ontology libraries for storing and indexing ontologies are increasing in number and demand. Semantic and ontology search engines have also started to appear, facilitating the search and retrieval of online ontologies.
Keywords: Software Development, Ontology, Ontology Library, Artificial Intelligence, Protégé, Object Model.
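An assumption-level sketch of the object-discovery step: pull classes, properties and subclass links out of an OWL ontology with rdflib, as raw material for an initial UML object model. The travel.owl filename is a hypothetical placeholder for a tourism-domain ontology.

```python
from rdflib import Graph, RDF, RDFS, OWL, URIRef

g = Graph()
g.parse("travel.owl")   # hypothetical tourism-domain ontology file

# Named classes only (skip blank-node restrictions).
classes = {c for c in g.subjects(RDF.type, OWL.Class) if isinstance(c, URIRef)}
for cls in sorted(classes):
    supers = [g.qname(s) for s in g.objects(cls, RDFS.subClassOf) if s in classes]
    props = [g.qname(p) for p in g.subjects(RDFS.domain, cls)]
    line = f"class {g.qname(cls)}"
    if supers:
        line += " extends " + ", ".join(supers)   # UML generalization candidates
    print(line)
    for p in props:
        print(f"    attribute/association: {p}")  # UML attribute/association candidates
```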
2116 Revealing Nonlinear Couplings between Oscillators from Time Series
Authors: B.P. Bezruchko, D.A. Smirnov
Abstract:
Quantitative characterization of nonlinear directional couplings between stochastic oscillators from data is considered. We suggest coupling characteristics readily interpreted from a physical viewpoint and their estimators. An expression for a statistical significance level is derived analytically that allows reliable coupling detection from a relatively short time series. Performance of the technique is demonstrated in numerical experiments.
Keywords: Nonlinear time series analysis, directional couplings, coupled oscillators.
2115 Simulation of Sample Paths of Non-Gaussian Stationary Random Fields
Authors: Fabrice Poirion, Benedicte Puig
Abstract:
Mathematical justifications are given for a simulation technique for multivariate non-Gaussian random processes and fields based on Rosenblatt's transformation of Gaussian processes. Different types of convergence are established for the approximating sequence. Moreover, an original numerical method is proposed to solve the functional equation yielding the underlying Gaussian process autocorrelation function.
Keywords: Simulation, non-Gaussian, random field, multivariate, stochastic process.
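A minimal sketch of the memoryless-transform idea behind such methods: generate a stationary Gaussian path, then map it through the Gaussian CDF and the inverse target CDF to impose a non-Gaussian marginal (exponential here, as an illustrative choice). The paper's treatment of the underlying Gaussian autocorrelation equation is not reproduced.

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(3)
n = 2048
# Stationary Gaussian path via an AR(1) recursion with lag-1 correlation 0.9.
g = np.empty(n)
g[0] = rng.normal()
for t in range(1, n):
    g[t] = 0.9 * g[t - 1] + np.sqrt(1 - 0.9**2) * rng.normal()

x = expon.ppf(norm.cdf(g))   # impose an exponential marginal distribution
print(x.mean(), x.std())     # both near 1 for a unit exponential
```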
2114 Selective Excitation of Circular Helical Modes in Graded Index Fibers
Authors: S. Al-Sowayan
Abstract:
The impact of selectively exciting the circular helical modes of graded-index fibers on fiber capacity is analyzed using a model of propagation delay variation with launch offset and angle resulting from misalignment of the source and fiber axes. The results point to a promising technique for improving graded-index fiber capacity.
Keywords: Fiber measurements, Fiber optic communications.
2113 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools
Abstract:
Internet use, intelligent communication tools, and social media have all become an integral part of our daily lives as a result of rapid developments in information technology. However, this widespread use increases crime committed in the digital environment. Therefore, digital forensics, dealing with the various crimes committed in the digital environment, has become an important research topic. It is in the research scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data on the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. Moreover, the outcome depends on the examiner's experience, and relevant evidence may be overlooked, changing the overall result from case to case. In this study, a hash-based matching and digital evidence evaluation method is proposed. It aims to automatically classify evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human error.
Keywords: Block matching, digital evidence, hash list.
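A minimal sketch of hash-based block matching for evidence images: hash fixed-size blocks of the image file and flag blocks whose digests appear in a known hash list. The filenames and block size are illustrative assumptions.

```python
import hashlib

BLOCK_SIZE = 4096

def block_hashes(path, block_size=BLOCK_SIZE):
    """Yield (offset, sha256-hex) for each fixed-size block of a file."""
    with open(path, "rb") as f:
        offset = 0
        while chunk := f.read(block_size):
            yield offset, hashlib.sha256(chunk).hexdigest()
            offset += len(chunk)

# Hypothetical hash list of known crime-related blocks, one digest per line.
known = {line.strip() for line in open("crime_related_hashes.txt")}
hits = [(off, h) for off, h in block_hashes("evidence.dd") if h in known]
print(f"{len(hits)} blocks matched the hash list")
```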
2112 An Approach to Secure Mobile Agent Communication in Multi-Agent Systems
Authors: Olumide Simeon Ogunnusi, Shukor Abd Razak, Michael Kolade Adu
Abstract:
An inter-agent communication manager facilitates communication among mobile agents via a message passing mechanism. Until now, all Foundation for Intelligent Physical Agents (FIPA) compliant agent systems have been capable of exchanging messages following the standard format for sending and receiving messages. Previous works tend to secure the messages exchanged among a community of collaborative agents commissioned to perform specific tasks by using cryptosystems. However, that approach is characterized by computational complexity due to the encryption and decryption processes required at the two ends. The proposed approach to secure agent communication allows only agents created by the host agent server to communicate via the agent communication channel provided by the host agent platform. These agents are assumed to be harmless. Therefore, to secure the communication of legitimate agents from intrusion by external agents, a 2-phase policy enforcement system was developed. The first phase constrains an external agent to run only on the network server, while the second phase confines the activities of the external agent to its execution environment. To implement the proposed policy, a controller agent was charged with screening any external agent entering the local area network and preventing it from migrating to the agent execution host where the legitimate agents run. On arrival of an external agent at the host network server, an introspector agent was charged with monitoring and restraining its activities. This approach secures legitimate agent communication from Man-in-the-Middle and replay attacks.
Keywords: Agent communication, introspective agent, isolation of agent, policy enforcement system.
2111 Interoperable CNC System for Turning Operations
Authors: Yusri Yusof, Stephen Newman, Aydin Nassehi, Keith Case
Abstract:
The changing economic climate has made global manufacturing a growing reality over the last decade, forcing companies from east and west and all over the world to collaborate beyond geographic boundaries in the design, manufacture and assembly of products. The ISO 10303 and ISO 14649 standards (STEP and STEP-NC) have been developed to introduce interoperability into manufacturing enterprises so as to meet the challenge of responding to production on demand. This paper describes and illustrates a STEP compliant CAD/CAPP/CAM system for the manufacture of rotational parts on CNC turning centers. The information models to support the proposed system, together with the data models defined in the ISO 14649 standard used to create the NC programs, are also described. A structured view of a STEP compliant CAD/CAPP/CAM system framework supporting the next generation of intelligent CNC controllers for turn/mill component manufacture is provided. Finally, a proposed computational environment for a STEP-NC compliant system for turning operations (SCSTO) is described. SCSTO is the experimental part of the research, supported by the specification of information models and constructed using a structured methodology and object-oriented methods. SCSTO was developed to generate a Part 21 file based on machining features, to support the interactive generation of process plans utilizing feature extraction. A case study component has been developed to prove the concept of using the milling and turning parts of ISO 14649 to provide a turn-mill CAD/CAPP/CAM environment.
2110 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components
Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich
Abstract:
This study investigates the balancing of the number of operators (the line balancing technique) in the production line of hard disk drive components in order to increase efficiency. At present, the use of hard disk drives has continuously declined, limiting a company's revenue potential. It is therefore important to improve and develop the production process to create market share and to be able to compete on value and quality, and an effective tool is needed to support such efforts. In this research, the Arena simulation program was applied to analyze the results both before and after the improvement, and the improvement was validated before being applied to the real process. The RA production process where this study was conducted comprised 14 work stations with 35 operators altogether. In the actual process, the average production time was 84.03 seconds per piece (from 30 timings at each work station), with a performance rating of 123% assessed under the Westinghouse principles. Assuming a 5% allowance time, the standard time was 108.53 seconds per piece. The takt time, calculated as the working duration in one day divided by customer demand, was 3.66 seconds per piece. From these values, the proper number of operators was 30, meaning five operators should be removed to improve the efficiency of the production process. A simulation model of the actual process was then built in the Arena program, and its reliability was confirmed by comparing the simulated outputs with those of the actual process; their agreement indicated that the model was reliable. The operator numbers and their job responsibilities were then remodelled in the Arena program. As a result, the efficiency of the production process was enhanced from 70.82% to 82.63%, meeting the target.
Keywords: Hard disk drive, line balancing, simulation, Arena program.
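The line-balancing arithmetic reported above, as a short check (values from the abstract; standard time = observed time x rating x (1 + allowance) is the usual convention).

```python
import math

observed_time = 84.03    # s/piece, mean of 30 timings per station
rating = 1.23            # Westinghouse performance rating (123%)
allowance = 0.05         # 5% allowance time

standard_time = observed_time * rating * (1 + allowance)
takt_time = 3.66         # s/piece = daily working time / daily demand

operators = math.ceil(standard_time / takt_time)
print(round(standard_time, 2), operators)   # 108.52 (abstract reports 108.53), 30
```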
2109 User Pattern Learning Algorithm based MDSS (Medical Decision Support System) Framework under Ubiquitous
Authors: Insung Jung, Gi-Nam Wang
Abstract:
In this paper, we present a user pattern learning algorithm based MDSS (medical decision support system) for a ubiquitous environment. Most research focuses on hardware systems, hospital management and the whole concept of the ubiquitous environment, even though these are hard to implement. Our objective in this paper is to design an MDSS framework. It helps patients with medical treatment and with prevention for high-risk patients (COPD, heart disease, diabetes). The framework consists of a database, a CAD (computer aided diagnosis support system) and a CAP (computer aided user vital sign prediction system). It can be applied to develop user pattern learning algorithm based MDSS for homecare and silver town services. In particular, the CAD has wise decision-making competency: it compares the current vital sign with the user's normal condition pattern data. In addition, the CAP computes vital sign predictions using the patient's past data. The novel approach uses a neural network method, wireless vital sign acquisition devices and a personal computer DB system. An intelligent agent based MDSS will help elderly people and high-risk patients to prevent sudden death and disease, help the physician to get online access to patients' data, and support the planning of medication service priority (e.g. emergency cases).
Keywords: Neural network, U-healthcare, MDSS, CAP, DSS.
2108 Seismic Protection of Automated Stocker System by Customized Viscous Fluid Dampers
Authors: Y. P. Wang, J. K. Chen, C. H. Lee, G. H. Huang, M. C. Wang, S. W. Chen, Y. T. Kuan, H. C. Lin, C. Y. Huang, W. H. Liang, W. C. Lin, H. C. Yu
Abstract:
The hi-tech industries in the Science Park in southern Taiwan were heavily damaged by a strong earthquake in early 2016. The financial loss in this event was attributed primarily to the automated stocker system handling fully processed products, and recovery of the automated stocker system from the aftermath proved to be a major contributor to the lead time. Therefore, the development of effective means of protecting stockers against earthquakes has become the highest priority for risk minimization and business continuity. This study proposes to mitigate the seismic response of the stockers by introducing viscous fluid dampers between the ceiling and the tops of the stockers. The stocker is expected to vibrate less violently with a passive control force acting on top. A linear damper is considered in this application, with an optimal damping coefficient determined from a preliminary parametric study. The damper is small in comparison with those adopted for building or bridge applications. Component tests of the dampers have been carried out to make sure they meet the design requirements. Shake table tests have further been conducted to verify the proposed scheme under realistic earthquake conditions. Encouraging results have been achieved, with the seismic responses effectively reduced by up to 60% and the FOUPs prevented from falling off the shelves, which would otherwise be the case if left unprotected. The effectiveness of adopting a viscous fluid damper between the top of the stocker and the ceiling for seismic control has been confirmed. This technique has been adopted by Macronix International Co., LTD for the seismic retrofit of existing stockers. Demonstration projects applying the proposed technique are underway for other companies in the display industry as well.
Keywords: Hi-tech industries, seismic protection, automated stocker system, viscous fluid damper.
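For the linear viscous damper considered, the passive control force is proportional to the relative velocity across the device (a textbook relation; the optimal coefficient c comes from the parametric study mentioned above):

```latex
F_d(t) = c\,\dot{x}(t),
```

where x(t) is the relative displacement between the ceiling and the stocker top, so the damper dissipates energy without adding stiffness.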
2107 Feature Analysis of Predictive Maintenance Models
Authors: Zhaoan Wang
Abstract:
Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the features that contribute most to failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important ones.
Keywords: Automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation.
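A sketch of the final interpretation step: fit a classifier on synthetic sensor features, then use SHAP's linear explainer to attribute predictions to features. Feature names, the binary failure flag and data are illustrative assumptions.

```python
import numpy as np
import shap
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 4))   # e.g. torque, temperature, speed, tool wear
y = (2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=500)) > 0  # failure flag
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_tr, y_tr)
explainer = shap.LinearExplainer(model, X_tr)   # linear SHAP explainer
shap_values = explainer.shap_values(X_te)       # per-sample, per-feature attributions

# Global importance: mean |SHAP value| per feature; the first feature should dominate.
print(np.abs(shap_values).mean(axis=0))
```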
2106 An Evaluation on the Effectiveness of a 3D Printed Composite Compression Mold
Authors: Peng Hao Wang, Garam Kim, Ronald Sterkenburg
Abstract:
The applications of composite materials within the aviation industry have been increasing at a rapid pace. However, the growing applications of composite materials have also led to growing demand for tooling to support their manufacturing processes. Tooling and tooling maintenance represent a large portion of the composite manufacturing process and its cost. Therefore, the industry's adaptability to new techniques for fabricating high quality tools quickly and inexpensively will play a crucial role in composite materials' growing popularity in the aviation industry. One popular tool fabrication technique currently being developed involves additive manufacturing, such as 3D printing. Although additive manufacturing and 3D printing are not entirely new concepts, the technique has been gaining popularity due to its ability to quickly fabricate components with low material waste and low cost. In this study, a team of Purdue University School of Aviation and Transportation Technology (SATT) faculty and students investigated the effectiveness of a 3D printed composite compression mold. The mold was fabricated by 3D scanning a steel valve cover of an aircraft reciprocating engine and was used to fabricate carbon fiber versions of that valve cover. The 3D printed composite compression mold was evaluated for its performance, durability, and dimensional stability, while the fabricated carbon fiber valve covers were evaluated for their accuracy and quality. The results and data gathered from this study will determine the effectiveness of the 3D printed composite compression mold in a mass production environment and provide valuable information for future understanding, improvements, and design considerations of 3D printed composite molds.
Keywords: Additive manufacturing, carbon fiber, composite tooling, molds.
2105 Intelligent Assistive Methods for Diagnosis of Rheumatoid Arthritis Using Histogram Smoothing and Feature Extraction of Bone Images
Authors: SP. Chokkalingam, K. Komathy
Abstract:
Advances in the field of image processing envision a new era of evaluation techniques and application of procedures in various fields, one of which is the biomedical field, for the prognosis as well as diagnosis of diseases. Though this plethora of methods provides a wide range of options to select from, it also causes confusion in selecting the apt process and in finding which one is more suitable. Our objective is to use a series of techniques on bone scans so as to detect the occurrence of rheumatoid arthritis (RA) as accurately as possible. Our proposed system tends to be more effective than other techniques existing in the field, as it depends on new methodologies that have been proved to be better and more consistent. Computer aided diagnosis provides a more accurate and consistent rate that helps to improve the efficiency of the system. The image first undergoes histogram smoothing and specification, a morphing operation, boundary detection by an edge following algorithm, and finally image subtraction to determine the presence of rheumatoid arthritis in a more efficient and effective way. In preprocessing, noise is removed from the images; using segmentation, the region of interest is found; and histogram smoothing is applied to a specific portion of the images. Gray level co-occurrence matrix (GLCM) features such as mean, median, energy and correlation, along with bone mineral density (BMD), are then extracted and stored in a database. This dataset is trained with inflamed and non-inflamed values; with the help of a neural network, all new images are checked for their status, and rough set theory is implemented for further feature reduction.
Keywords: Computer Aided Diagnosis, Edge Detection, Histogram Smoothing, Rheumatoid Arthritis.
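A minimal sketch of the GLCM texture-feature step using scikit-image on a synthetic region of interest (the rest of the pipeline, histogram specification, morphing, edge following and image subtraction, is not reproduced here).

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(6)
bone_roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in ROI

# Co-occurrence matrix at distance 1 for horizontal and vertical pixel pairs.
glcm = graycomatrix(bone_roi, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("energy", "correlation", "contrast", "homogeneity")}
print(features)
```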
2104 Development of State Model Theory for External Exclusive NOR Type LFSR Structures
Authors: Afaq Ahmad
Abstract:
Using the state space technique and GF(2) theory, a simulation model for external exclusive NOR type LFSR structures is developed. Through this tool a systematic procedure is devised for computing pseudo-random binary sequences from such structures.
Keywords: LFSR, external exclusive NOR type, recursive binary sequence, initial state - next state, state transition matrix.
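A small sketch of the state-model idea: the next state of an external XNOR-feedback LFSR is an affine map of the current state over GF(2), so the shift-register recursion and the state-transition-matrix view must agree. Tap positions below are an illustrative choice, not the paper's.

```python
import numpy as np

def xnor_lfsr(state, taps, steps):
    """Generate a pseudo-random bit sequence from an external XNOR LFSR."""
    state = list(state)
    out = []
    for _ in range(steps):
        out.append(state[-1])                        # output = last stage
        fb = 1 - (sum(state[t] for t in taps) % 2)   # XNOR = complemented XOR
        state = [fb] + state[:-1]                    # shift, feed back at front
    return out

# State-transition view over GF(2): s_next = (T @ s + c) mod 2, with a
# companion-form T and constant vector c carrying the XNOR complement.
n, taps = 4, (0, 3)
T = np.zeros((n, n), dtype=int)
T[0, list(taps)] = 1                                 # feedback row
T[np.arange(1, n), np.arange(0, n - 1)] = 1          # shift rows
c = np.array([1] + [0] * (n - 1))                    # the XNOR "+1"

seq = xnor_lfsr([0, 0, 0, 0], taps, 15)              # all-zero state is valid for XNOR
s, seq_matrix = np.array([0, 0, 0, 0]), []
for _ in range(15):
    seq_matrix.append(int(s[-1]))
    s = (T @ s + c) % 2
print(seq, seq == seq_matrix)                        # identical sequences: True
```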
2103 Genetic Programming: Principles, Applications and Opportunities for Hydrological Modelling
Authors: Oluwaseun K. Oyebode, Josiah A. Adeyemo
Abstract:
Hydrological modelling plays a crucial role in the planning and management of water resources, most especially in water stressed regions where the need to effectively manage the available water resources is of critical importance. However, due to the complex, nonlinear and dynamic behaviour of hydro-climatic interactions, achieving reliable modelling of water resource systems and accurate projection of hydrological parameters is extremely challenging. Although a significant number of modelling techniques (process-based and data-driven) have been developed and adopted in that regard, the field of hydrological modelling is still considered one that has progressed sluggishly over the past decades. This is largely a result of the degree of uncertainty identified in the methodologies and results of the techniques adopted. In recent times, evolutionary computation (EC) techniques have been developed and introduced in response to the search for efficient and reliable means of providing accurate solutions to hydrological problems. This paper presents a comprehensive review of the underlying principles, methodological needs and applications of a promising evolutionary computation modelling technique – genetic programming (GP). It examines the specific characteristics of the technique which make it suitable for solving hydrological modelling problems. It discusses the opportunities inherent in the application of GP to water-related studies such as rainfall estimation, rainfall-runoff modelling, streamflow forecasting, sediment transport modelling, water quality modelling and groundwater modelling, among others. Furthermore, the means by which such opportunities could be harnessed in the near future are discussed. In all, a case is made for the full embrace of GP and its variants in hydrological modelling studies, so as to put in place strategies that would translate into meaningful progress in the modelling of water resource systems and positively influence decision-making by relevant stakeholders.
Keywords: Computational modelling, evolutionary algorithms, genetic programming, hydrological modelling.
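A toy sketch of the kind of symbolic regression GP performs in such studies: evolve expression trees that map an input x to a target signal. The operator set, mutation-only evolution, rates and target function are deliberately minimal illustrative assumptions, not a production GP system.

```python
import operator
import random

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", random.uniform(-2, 2)])   # terminal: variable or constant
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, l, r = tree
    return OPS[op](evaluate(l, x), evaluate(r, x))

def mutate(tree, depth=2):
    if random.random() < 0.2:
        return random_tree(depth)            # replace this subtree wholesale
    if not isinstance(tree, tuple):
        return tree                          # keep the terminal
    op, l, r = tree
    return (op, mutate(l, depth), mutate(r, depth))

xs = [i / 10 for i in range(-20, 21)]
target = [x * x + x for x in xs]             # "unknown" process to recover

def fitness(tree):
    try:
        err = sum((evaluate(tree, x) - t) ** 2 for x, t in zip(xs, target))
    except OverflowError:
        return float("inf")
    return err if err == err else float("inf")   # map NaN to inf

random.seed(4)
pop = [random_tree() for _ in range(200)]
for gen in range(40):
    pop.sort(key=fitness)
    survivors = pop[:50]                     # truncation selection
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(150)]
pop.sort(key=fitness)
print(pop[0], fitness(pop[0]))               # best evolved expression and its error
```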