Search results for: value stream mapping.
204 Simulation of Fluid Flow and Heat Transfer in Inclined Cavity using Lattice Boltzmann Method
Authors: Arash Karimipour, A. Hossein Nezhad, E. Shirani, A. Safaei
Abstract:
In this paper, the Lattice Boltzmann Method (LBM) is used to study laminar flow with mixed convection heat transfer inside a two-dimensional inclined lid-driven rectangular cavity with aspect ratio AR = 3. The bottom wall of the cavity is maintained at a lower temperature than the top lid, and its vertical walls are assumed insulated. Motion of the top lid drives the fluid inside the cavity. Inclination of the cavity causes both the horizontal and vertical velocity components to be affected by the buoyancy force. To include this effect, the procedure for calculating macroscopic properties in the LBM is changed and the collision term of the Boltzmann equation is modified. A computer program is developed to simulate this problem using the BGK model of the lattice Boltzmann method. The effects of variations in the Richardson number and the inclination angle on the thermal and flow behavior of the fluid inside the cavity are investigated. The results are presented as velocity and temperature profiles, stream function contours and isotherms. It is concluded that the LBM has good potential to simulate mixed convection heat transfer problems.
Keywords: gravity, inclined lid driven cavity, lattice Boltzmann method, mixed convection.
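The abstract describes modifying the BGK collision step so that buoyancy acts along both lattice axes when the cavity is inclined. The sketch below illustrates one common way of doing this in a D2Q9 lattice Boltzmann code: standard BGK relaxation plus a first-order Boussinesq forcing term whose x and y components depend on the inclination angle. The names, parameter values and the specific forcing scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# D2Q9 lattice velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Second-order D2Q9 equilibrium distribution (arrays of shape (9, nx, ny))."""
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def collide_with_buoyancy(f, rho, ux, uy, T, tau, g_beta, T_ref, phi):
    """BGK collision plus a simple first-order Boussinesq forcing term.
    Tilting the cavity by angle phi (radians) splits the buoyancy force into
    components along both lattice axes, which is the kind of modification of
    the collision term the abstract refers to."""
    gx = g_beta * (T - T_ref) * np.sin(phi)     # buoyancy component along x
    gy = g_beta * (T - T_ref) * np.cos(phi)     # buoyancy component along y
    force = 3 * w[:, None, None] * rho * (c[:, 0, None, None]*gx +
                                          c[:, 1, None, None]*gy)
    return f - (f - equilibrium(rho, ux, uy)) / tau + force

# Toy usage on a 90 x 30 grid (aspect ratio 3, as in the paper)
nx, ny = 90, 30
rho = np.ones((nx, ny)); ux = np.zeros((nx, ny)); uy = np.zeros((nx, ny))
T = np.linspace(0.0, 1.0, ny)[None, :] * np.ones((nx, ny))   # cold bottom, hot lid
f = equilibrium(rho, ux, uy)
f = collide_with_buoyancy(f, rho, ux, uy, T, tau=0.6,
                          g_beta=1e-3, T_ref=0.5, phi=np.pi/6)
```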
203 A Method to Predict Hemorrhage Disease of Grass Carp Trends
Authors: Zhongxu Chen, Jun Yang, Heyue Mao, Xiaoyu Zheng
Abstract:
Hemorrhage Disease of Grass Carp (HDGC) is a commonly occurring illness in summer, and its extremely high death rate results in colossal losses to aquaculture. Because of the complex connections among the factors that influence aquaculture diseases, no fully satisfactory mathematical model for the problem is available at present. A back-propagation (BP) neural network, with its excellent nonlinear mapping capability, was adopted to establish the mathematical model. Easily measured environmental factors, such as breeding density, water temperature, pH and light intensity, were set as the main inputs. 25 groups of experimental data were used for training and testing, and the accuracy of the model in predicting the trend of HDGC was above 80%. It is demonstrated that the BP neural network approach to predicting HDGC is objective and practical, and it can be extended to other aquaculture diseases.
Keywords: Aquaculture, Hemorrhage Disease of Grass Carp, BP Neural Network.
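A minimal sketch of the kind of back-propagation classifier the abstract describes, with synthetic data standing in for the 25 experimental groups; the feature ranges, the toy outbreak rule and the network size are invented for illustration and are not the authors' data or settings.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: breeding density, water temperature (deg C), pH, light intensity.
rng = np.random.default_rng(42)
X = np.column_stack([
    rng.uniform(5, 50, 200),      # breeding density
    rng.uniform(18, 34, 200),     # water temperature
    rng.uniform(6.0, 9.0, 200),   # pH
    rng.uniform(100, 2000, 200),  # light intensity
])
y = ((X[:, 0] > 30) & (X[:, 1] > 27)).astype(int)   # invented outbreak-trend rule

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```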
202 Unified Fusion Approach with Application to SLAM
Authors: Xinde Li, Xinhan Huang, Min Wang
Abstract:
In this paper, we propose a pre-processor based on the Evidence Supporting Measure of Similarity (ESMS) filter and a unified fusion approach (UFA) based on the general fusion machine coupled with the ESMS filter, which improve the correctness and precision of information fusion in any field of application. Here we mainly apply the new approach to Simultaneous Localization And Mapping (SLAM) with Pioneer II mobile robots. A simulation experiment was performed, in which an autonomous virtual mobile robot with sonar sensors evolves in a virtual world map with obstacles. By comparing the maps built with the general fusion machine (here a DSmT-based fusion machine with a PCR5-based conflict redistributor) coupled with and without the ESMS filter, we show the benefit of source selection as a prerequisite for improving information fusion, and we demonstrate the superiority of the UFA for SLAM.
Keywords: DSmT, ESMS filter, SLAM, UFA.
201 Virtual Machines Cooperation for Impatient Jobs under Cloud Paradigm
Authors: Nawfal A. Mehdi, Ali Mamat, Hamidah Ibrahim, Shamala K. Syrmabn
Abstract:
The increasing demand for IT resources is driving enterprises to the cloud as a cheap and scalable solution. Cloud computing delivers on its promises by using the virtual machine as the basic unit of computation. However, the predefined settings of a virtual machine might not be enough to satisfy the QoS requirements of jobs. This paper addresses the problem of mapping jobs with critical start deadlines to virtual machines with predefined specifications. These virtual machines are hosted by physical machines and share a fixed amount of bandwidth. The paper proposes an algorithm that uses the bandwidth of idle virtual machines to increase the quota of the virtual machines nominated as executors for urgent jobs. An empirical study of the algorithm evaluates the impact of the proposed model on impatient jobs. The results show the importance of dynamic bandwidth allocation in virtualized environments and its effect on the throughput metric.
Keywords: Insufficient bandwidth, virtual machine, cloud provider, impatient jobs.
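The core idea is lending the bandwidth of idle virtual machines to the VMs executing urgent (impatient) jobs. The sketch below is a deliberately simple interpretation of that reallocation step; the VM fields, the share parameter and the equal split among executors are assumptions for illustration, not the paper's algorithm.

```python
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    bandwidth: float   # Mbps currently allocated to this virtual machine
    idle: bool

def lend_idle_bandwidth(vms, urgent_executors, share=0.5):
    """Take a fraction of each idle VM's bandwidth and split it equally among
    the VMs nominated as executors for urgent (impatient) jobs."""
    executors = [vm for vm in vms if vm.name in urgent_executors and not vm.idle]
    if not executors:
        return vms
    spare = 0.0
    for vm in vms:
        if vm.idle:
            spare += vm.bandwidth * share
            vm.bandwidth *= (1 - share)
    bonus = spare / len(executors)
    for vm in executors:
        vm.bandwidth += bonus
    return vms

pool = [VM("vm1", 100.0, idle=True), VM("vm2", 100.0, idle=False), VM("vm3", 80.0, idle=True)]
print(lend_idle_bandwidth(pool, urgent_executors={"vm2"}))
# vm2 ends up with 190 Mbps; vm1 and vm3 drop to 50 and 40 Mbps respectively.
```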
200 Evaluating Portfolio Performance by Highlighting Network Property and the Sharpe Ratio in the Stock Market
Authors: Zahra Hatami, Hesham Ali, David Volkman
Abstract:
Selecting a portfolio for investing is a crucial decision for individuals and legal entities. In the last two decades, with economic globalization, a stream of financial innovations has rushed to the aid of financial institutions. Selecting stocks for a portfolio is always a challenging task for investors. This study aims to create a financial network to identify optimal portfolios using network centrality metrics. The research presents a community detection technique for superior stocks that can be described as an optimal stock portfolio to be used by investors. By using the advantages of a network and its properties in the extracted communities, a group of stocks was selected for each of the various time periods. The performance of the optimal portfolios was compared to a well-known index, and their Sharpe ratios were calculated for each period to evaluate their profitability for decision making. The analysis shows that a potential portfolio selected from stocks with low centrality measurements can outperform the market; however, it has a lower Sharpe ratio than stocks with high centrality scores. In other words, stocks with low centralities could outperform the S&P 500 yet have a lower Sharpe ratio than highly central stocks.
Keywords: Portfolio management performance, network analysis, centrality measurements, Sharpe ratio.
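A small sketch of the two computations the abstract combines: selecting low-centrality stocks from a correlation network and scoring the resulting equal-weight portfolio with an annualised Sharpe ratio. The synthetic returns, the 0.3 correlation threshold and the use of degree centrality are illustrative choices, not the study's data or exact methodology.

```python
import numpy as np
import networkx as nx

# Synthetic daily returns for 20 stocks over one year (stand-in data).
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=(250, 20))
corr = np.corrcoef(returns.T)

# Correlation network: connect two stocks if |correlation| exceeds a threshold.
n = corr.shape[0]
G = nx.Graph()
G.add_nodes_from(range(n))
for i in range(n):
    for j in range(i + 1, n):
        if abs(corr[i, j]) > 0.3:
            G.add_edge(i, j, weight=abs(corr[i, j]))

centrality = nx.degree_centrality(G)
low_central = sorted(centrality, key=centrality.get)[:5]   # peripheral stocks

def sharpe(weights, returns, rf_daily=0.0):
    """Annualised Sharpe ratio of a fixed-weight portfolio."""
    excess = returns @ weights - rf_daily
    return np.sqrt(252) * excess.mean() / excess.std(ddof=1)

w = np.zeros(n)
w[low_central] = 1 / len(low_central)          # equal-weight low-centrality portfolio
print(f"annualised Sharpe of low-centrality portfolio: {sharpe(w, returns):.2f}")
```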
199 Identification of PIP Aquaporin Genes from Wheat
Authors: Sh. A. Yousif, M. Bhave
Abstract:
There is strong evidence that water channel proteins, the aquaporins (AQPs), are central components in plant-water relations as well as in a number of other physiological processes. We had previously reported the isolation of 24 plasma membrane intrinsic protein (PIP) type AQPs. However, the gene numbers in rice and the polyploid nature of bread wheat indicated a high probability of further genes in the latter. The present work focused on the identification of further AQP isoforms in bread wheat. With the use of altered primer design, we identified five homologous genes, designated PIP1;5b, PIP2;9b, TaPIP2;2, TaPIP2;2a and TaPIP2;2b. Sequence alignments indicate that PIP1;5b and PIP2;9b are likely to be homeologues of two previously reported genes, while the other three are new genes and could be homeologues of each other. The results indicate further AQP diversity in wheat, and the sequence data will enable physical mapping of these genes to identify their genomes, as well as genetic mapping to determine their association with any quantitative trait loci (QTLs) related to plant-water relations such as salinity or drought tolerance.
Keywords: Aquaporins, homeologues, PIP, wheat.
198 A Review of Quality Relationship between IT Processes, IT Products and IT Services
Authors: Whee Yen Wong, Chan Wai Lee, Kim Yeow Tshai
Abstract:
Producing IT products/services requires careful design. The IT development process is intangible and labour intensive. Making optimal use of available resources, both soft (knowledge, skill-set, etc.) and hard (computer systems, ancillary equipment, etc.), is vital if IT development is to achieve sensible economic advantages. Apart from the norms of the Project Life Cycle and the System Development Life Cycle (SDLC), there is an urgent need to establish a general yet widely acceptable guideline on the most effective and efficient way to proceed with an IT project in the broader view of the Product Life Cycle. The current paper proposes such a framework with two major areas of concern: (1) the integration of IT Products and IT Services within an existing IT Process architecture; and (2) how IT Products and IT Services are built into the framework of the Product Life Cycle, Project Life Cycle and SDLC.
Keywords: Mapping of Quality Relationship, IT Processes/IT Products/IT Services, Product Life Cycle, System Development Life Cycle.
197 A Fast Neural Algorithm for Serial Code Detection in a Stream of Sequential Data
Authors: Hazem M. El-Bakry, Qiangfu Zhao
Abstract:
In recent years, fast neural networks for object/face detection have been introduced based on cross correlation in the frequency domain between the input matrix and the hidden weights of neural networks. In our previous papers [3,4], fast neural networks for certain code detection were introduced. It was proved in [10] that for fast neural networks to give the same correct results as conventional neural networks, both the weights of the neural networks and the input matrix must be symmetric. This condition made those fast neural networks slower than conventional neural networks. Another symmetric form for the input matrix was introduced in [1-9] to speed up the operation of these fast neural networks. Here, corrections to the cross correlation equations (given in [13,15,16]) to compensate for the symmetry condition are presented. After these corrections, it is proved mathematically that the number of computation steps required for fast neural networks is less than that needed by classical neural networks. Furthermore, there is no need to convert the input data into symmetric form. Moreover, the new idea is applied to increase the speed of neural networks when processing complex values. Simulation results after these corrections using MATLAB confirm the theoretical computations.
Keywords: Fast Code/Data Detection, Neural Networks, Cross Correlation, real/complex values.
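The speed-up the abstract builds on comes from computing cross correlation in the frequency domain instead of sliding the weights over the input. The sketch below shows that step in plain NumPy for a one-dimensional stream; it is not the authors' neural network code, and the embedded code and signal lengths are invented.

```python
import numpy as np

def cross_correlate_fft(signal, template):
    """Cross-correlation via the convolution theorem: multiply the FFT of the
    signal by the conjugate FFT of the template, then invert. Equivalent to
    sliding the template over the signal, but O(n log n) instead of O(n*m)."""
    n = len(signal) + len(template) - 1
    nfft = 1 << (n - 1).bit_length()           # next power of two
    S = np.fft.rfft(signal, nfft)
    T = np.fft.rfft(template, nfft)
    return np.fft.irfft(S * np.conj(T), nfft)[:len(signal)]

rng = np.random.default_rng(1)
stream = rng.normal(size=1000)
code = rng.normal(size=32)
stream[500:532] += code                        # hide the code inside the stream
scores = cross_correlate_fft(stream, code)
print("strongest response near index", int(np.argmax(scores)))   # expected ~500
```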
196 Lean Environmental Management Integration System (LEMIS) Framework Development
Authors: Puvanasvaran, A. P., Suresh V., N. Norazlin
Abstract:
The Lean Environmental Management Integration System (LEMIS) framework development is an integration of the lean core elements and ISO 14001. Curiosity about the relationship between continuous improvement and the sustainability of lean implementation has steered this study toward LEMIS. Characteristics of the ISO 14001 standard clauses and the core elements of the lean principles are explored from past studies and literature reviews. A survey was carried out on ISO 14001 certified companies to examine the continual improvement achieved by implementing the ISO 14001 standard. The study found a significant and positive relationship between the lean principles (value, value stream, flow, pull and perfection) and the ISO 14001 requirements. LEMIS is significant in supporting continuous improvement and sustainability. The integration system can be implemented in any manufacturing company. It raises awareness of why organizations need to sustain their environmental management systems, while the lean principles can be adopted to streamline the daily activities of the company. Throughout the study, it was shown that there is no sacrifice or trade-off between the lean principles and the ISO 14001 requirements. The framework developed in the study can be further simplified in the future, especially the method of crossing each sub-requirement of the ISO 14001 standard with the core elements of the lean principles.
Keywords: LEMIS, ISO 14001, integration, framework.
195 Separation of Polyphenolics and Sugar by Ultrafiltration: Effects of Operating Conditions on Fouling and Diafiltration
Authors: Diqiao S. Wei, M. Hossain, Zaid S. Saleh
Abstract:
Polyphenolics and sugar are components of many fruit juices. In this work, the performance of ultrafiltration (UF) for separating phenolic compounds from apple juice was studied by performing batch experiments in a membrane module with an area of 0.1 m², fitted with a regenerated cellulose membrane of 1 kDa MWCO. The effects of various operating conditions on the performance were determined: transmembrane pressure (3, 4, 5 bar), temperature (30, 35, 40 ºC), pH (2, 3, 4, 5), feed concentration (3, 5, 7, 10, 15 ºBrix for apple juice) and feed flow rate (1, 1.5, 1.8 L/min). The optimum operating conditions were: transmembrane pressure 4 bar, temperature 30 ºC, feed flow rate 1–1.8 L/min, pH 3 and 10 ºBrix (apple juice). After performing ultrafiltration under these conditions, the concentration of polyphenolics in the retentate was increased by a factor of up to 2.7, with up to 70% recovered in the permeate and approximately 20% of the sugar in that stream. Application of diafiltration (addition of water to the concentrate) can restore the flux, which had decreased due to fouling, by a factor of 1.5. The material balance performed on the process shows the amount of deposits on the membrane and the extent of fouling in the system. In conclusion, ultrafiltration has been demonstrated as a potential technology to separate polyphenolics and sugars from their mixtures and can be applied to remove sugars from fruit juice.
Keywords: Fouling, membrane, polyphenols, ultrafiltration.
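The material balance mentioned at the end of the abstract is a simple solute bookkeeping exercise around the feed, permeate and retentate streams; the sketch below shows that calculation with made-up numbers (none of the figures are the paper's measurements).

```python
def uf_mass_balance(feed_vol, feed_conc, perm_vol, perm_conc, ret_conc):
    """Solute mass balance around a batch ultrafiltration run.
    Volumes in litres, concentrations in g/L, all measured values.
    Returns (concentration factor, permeate recovery, fraction deposited
    on the membrane); the last figure is one indicator of fouling."""
    ret_vol = feed_vol - perm_vol
    m_feed = feed_vol * feed_conc
    m_perm = perm_vol * perm_conc
    m_ret = ret_vol * ret_conc
    deposited = max(m_feed - m_perm - m_ret, 0.0)
    return ret_conc / feed_conc, m_perm / m_feed, deposited / m_feed

# Made-up numbers, purely for illustration:
cf, recovery, fouled = uf_mass_balance(10.0, 2.0, 8.0, 0.9, 6.0)
print(f"concentration factor {cf:.1f}x, permeate recovery {recovery:.0%}, deposited {fouled:.0%}")
```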
194 An Analysis of Digital Forensic Laboratory Development among Malaysia’s Law Enforcement Agencies
Authors: Sarah K. Taylor, Miratun M. Saharuddin, Zabri A. Talib
Abstract:
Cybercrime is on the rise, yet many Law Enforcement Agencies (LEAs) in Malaysia have no Digital Forensics Laboratory (DFL) to assist them in the acquisition and analysis of digital evidence. Of the estimated 30 LEAs in Malaysia, sadly, only eight own a DFL. All of these DFLs are concentrated in the capital of Malaysia, with none at the state level. LEAs are still depending on the national DFL (CyberSecurity Malaysia) even for simple and straightforward cases. A survey was conducted among the LEAs in Malaysia that own a DFL to understand their history of establishing the DFL, the challenges they faced and the significance of the DFL to their case investigations. The results showed that while some LEAs faced no challenge in establishing a DFL, others took seven to ten years to do so. The delay was due to the difficulty of convincing their management because of the high costs involved. The results also revealed that with the establishment of a DFL, LEAs were able to obtain forensic results faster and to meet their agency’s timeline expectations. It was also found that LEAs were able to obtain more meaningful forensic results in cases requiring niche expertise, compared with sending cases off to the national DFL. In addition, cases are getting more complex, and hence a continuous stream of budget for equipment and training is inevitable. The results derived from the study are hoped to be used by other LEAs in justifying to their management the benefits of establishing an in-house DFL.
Keywords: Digital forensics, digital forensics laboratory, digital evidence, law enforcement agency.
193 RFU Based Computational Unit Design For Reconfigurable Processors
Authors: M. Aqeel Iqbal
Abstract:
Fully customized hardware-based technology provides high performance and low power consumption by specializing tasks in hardware, but it lacks design flexibility, since any kind of change requires re-design and re-fabrication. Software-based solutions operate with software instructions, achieving great flexibility through the easy development and maintenance of the software code. However, this execution of instructions introduces a high overhead in performance and area consumption. In the past few decades, the reconfigurable computing domain has emerged, which overcomes the traditional trade-off between flexibility and performance and is able to achieve high performance while maintaining good flexibility. The dramatic gains in chip performance and design flexibility achieved through reconfigurable computing systems depend greatly on the design of their computational units and their integration with reconfigurable logic resources. The computational unit of any reconfigurable system plays a vital role in defining its strength. In this paper, an RFU-based computational unit design is presented using tightly coupled, multi-threaded reconfigurable cores. The proposed design has been simulated for VLIW-based architectures, and a high gain in performance has been observed compared to conventional computing systems.
Keywords: Configuration Stream, Configuration overhead, Configuration Controller, Reconfigurable devices.
192 Ultimately Bounded Takagi-Sugeno Fuzzy Management in Urban Traffic Stream Mechanism: Multi-Agent Modeling Approach
Authors: Reza Ghasemi, Negin Amiri Hazaveh
Abstract:
In this paper, a control methodology based on the selection of the type of traffic light and the duration of the green phase to achieve an optimal balance at intersections is proposed. This balance should be flexible with respect to variations over time and to randomness in the traffic situation; the goals of the proposed method are to reduce traffic volume, the average delay per vehicle, and the incidence of collisions. The proposed method was specifically investigated at intersections through appropriate timing of the traffic lights by modeling them as a multi-agent system. The system consists of a large number of intersections, each of which is considered an independent agent that exchanges information with the others, and the stability of each agent is established separately. Robustness against uncertainties, scalability, and stability of the overall closed-loop system are the main merits of the proposed methodology. The simulation results show that the Takagi-Sugeno (TS) fuzzy intelligent controller in this multi-agent system is more effective than fixed-time scheduling and reduces vehicle queue lengths.
Keywords: Fuzzy intelligent controller, traffic-light control, multi-agent systems, state space equations, stability.
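As an illustration of the Takagi-Sugeno idea used here (rules whose crisp consequents are combined by a firing-strength-weighted average), the sketch below computes a green-phase duration from queue length and arrival rate. The membership breakpoints and consequent green times are invented; the paper's controller additionally handles multi-agent information exchange and stability, which this toy omits.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def ts_green_time(queue_len, arrival_rate):
    """Zero-order Takagi-Sugeno inference: each rule fires with the product of
    its antecedent memberships and proposes a constant green time in seconds;
    the output is the firing-strength-weighted average of the consequents."""
    q_short = tri(queue_len, -1, 0, 20)
    q_long = tri(queue_len, 10, 40, 80)
    a_light = tri(arrival_rate, -0.1, 0.0, 0.6)   # vehicles per second
    a_heavy = tri(arrival_rate, 0.4, 1.0, 1.6)
    rules = [
        (q_short * a_light, 15.0),   # short queue, light traffic -> short green
        (q_short * a_heavy, 25.0),
        (q_long * a_light, 35.0),
        (q_long * a_heavy, 50.0),    # long queue, heavy traffic -> long green
    ]
    total = sum(strength for strength, _ in rules)
    return sum(strength * green for strength, green in rules) / (total + 1e-9)

print(f"suggested green phase: {ts_green_time(queue_len=18, arrival_rate=0.7):.1f} s")
```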
191 A Probability based Pair Extension Method in Protein 2-DE Gel Image Analysis
Authors: Yanhua Jin, Won Suk Lee
Abstract:
The two-dimensional gel electrophoresis (2-DE) method is widely used in proteomics to separate thousands of proteins in a sample. By comparing the expression levels of proteins in a normal sample with those in a diseased one, it is possible to identify a meaningful set of marker proteins for the targeted disease. The major shortcomings of this approach involve the inherent noise and irregular geometric distortions of the spots observed in 2-DE images. Various experimental conditions can be the major causes of these problems, which in the protein analysis of samples eventually lead to incorrect conclusions. In order to minimize the influence of these problems, this paper proposes a partition-based pair extension method that performs spot matching on a set of gel images multiple times and segregates the more reliable mapping results, which can improve the accuracy of gel image analysis. The improved accuracy of the proposed method is analyzed through various experiments on real 2-DE images of human liver tissues.
Keywords: Proteomics, spot-matching, two-dimensional electrophoresis.
190 Prioritizing Influential Factors on the Promotion of Virtual Training System
Authors: Nader Gharibnavaz, Mostafa Mosadeghi, Naser Gharibnavaz
Abstract:
In today's world, where everything is rapidly changing and information technology is highly developed, many features of culture, society, politics and the economy have changed. The advent of information technology and electronic data transmission has made communication easy, and fields like e-learning and e-commerce are easily accessible to everyone. One of these technologies is virtual training, and the quality of such education systems is critical. 131 questionnaires were prepared and distributed among university students at Toba University. The research thus examined the factors that affect the quality of learning from the perspectives of staff, students, professors and this type of university. It is concluded that the important factors in virtual training are the quality of the professors, the quality of the staff, and the quality of the university. These factors had the highest priority in this education system and are necessary for improving virtual training.
Keywords: Training, Virtual Training, Strategic Positioning, Positioning Mapping, Unique Selling Proposition, Strong Brands, Indoors industry.
189 A Modified Decoupled Semi-Analytical Approach Based On SBFEM for Solving 2D Elastodynamic Problems
Authors: M. Fakharian, M. I. Khodakarami
Abstract:
In this paper, an improvement to the semi-analytical method based on scaled boundaries for solving 2D elastodynamic problems is presented. In this approach, only the boundaries of the problem domain are discretized, using specific subparametric elements. Higher-order Lagrange polynomials are used as mapping functions, and together with special shape functions, Gauss-Lobatto-Legendre numerical integration and the integral form of the weighted residual method, they yield diagonal coefficient matrices in the equations of the elastodynamic problems. The differences between this study and prior research lie in the procedure for producing the geometry and in the choice of interpolation and integration schemes. The validity and accuracy of the present method are fully demonstrated through two benchmark problems which are successfully modeled using a small number of DOFs. The numerical results agree very well with the analytical solutions and with the results from other numerical methods.
Keywords: 2D Elastodynamic Problems, Lagrange Polynomials, G-L-L quadrature, Decoupled SBFEM.
188 Numerical and Infrared Mapping of Temperature in Heat Affected Zone during Plasma Arc Cutting of Mild Steel
Authors: Dalvir Singh, Somnath Chattopadhyaya
Abstract:
During welding or flame cutting of metals, the prediction of the heat affected zone (HAZ) is critical. There is a need to develop a simple mathematical model to calculate the temperature variation in the HAZ, and derivative analysis can be used for this purpose. This study presents an analytical solution for heat transfer through conduction in a mild steel plate. The homogeneous and nonhomogeneous boundary conditions are expressed in single variables. The full-field analytical solutions for the temperature, subjected to a local heating source, are first derived by the method of separation of variables, followed by experimental visualization using infrared imaging. Based on the present work, it is suggested that appropriate heat input characteristics control the temperature distribution in and around the HAZ.
Keywords: Conduction Heat Transfer, Heat Affected Zone (HAZ), Infra-Red Imaging, Numerical Method, Orthogonal Function, Plasma Arc Cutting, Separation of Variables, Temperature Measurement.
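For readers unfamiliar with the technique, a worked one-dimensional instance of separation of variables is given below. This is a textbook illustration (a slab with both ends held at zero temperature), not the authors' exact boundary-value problem, which involves a local heating source and mixed boundary conditions.

```latex
T(x,t) = \sum_{n=1}^{\infty} B_n \,
         \sin\!\left(\frac{n\pi x}{L}\right)
         \exp\!\left(-\alpha\,\frac{n^2\pi^2}{L^2}\,t\right),
\qquad
B_n = \frac{2}{L}\int_0^{L} f(x)\,\sin\!\left(\frac{n\pi x}{L}\right)\mathrm{d}x,
```

where L is the slab thickness, α the thermal diffusivity and f(x) the initial temperature profile.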
187 Mapping Semantic Networks to Undirected Networks
Authors: Marko A. Rodriguez
Abstract:
There exists an injective, information-preserving function that maps a semantic network (i.e., a directed labeled network) to a directed network (i.e., a directed unlabeled network). The edge label in the semantic network is represented as a topological feature of the directed network. Also, there exists an injective function that maps a directed network to an undirected network (i.e., an undirected unlabeled network). The edge directionality in the directed network is represented as a topological feature of the undirected network. Through function composition, there exists an injective function that maps a semantic network to an undirected network. Thus, aside from space constraints, the semantic network construct does not have any modeling functionality that is not possible with either a directed or undirected network representation. Two proofs of this idea will be presented. The first is a proof of the aforementioned function composition concept. The second is a simpler proof involving an undirected binary encoding of a semantic network.
Keywords: general-modeling, multi-relational networks, semantic networks.
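To make the first mapping concrete, the sketch below shows one simple way of encoding a labeled edge as pure topology: each labeled edge is reified into an edge node linked to its endpoints and to a shared node for its label. This is an illustrative construction in the spirit of the paper, not the specific injective function it proves correct (which must also distinguish gadget nodes from original nodes, and a further step encodes directionality itself).

```python
import networkx as nx

def reify_labels(triples):
    """Map a directed labeled network to a directed unlabeled one by turning
    each labeled edge (u -[label]-> v) into u -> e, e -> v, e -> label_node."""
    g = nx.DiGraph()
    for k, (u, label, v) in enumerate(triples):
        edge_node = f"edge:{k}"
        label_node = f"label:{label}"       # one shared node per distinct label
        g.add_edge(u, edge_node)
        g.add_edge(edge_node, v)
        g.add_edge(edge_node, label_node)
    return g

# Tiny semantic network as (subject, predicate, object) triples.
triples = [("marko", "created", "gremlin"),
           ("marko", "knows", "peter"),
           ("peter", "created", "gremlin")]
g = reify_labels(triples)
print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")   # 8 nodes, 9 edges
```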
186 Generation of Artificial Earthquake Accelerogram Compatible with Spectrum using the Wavelet Packet Transform and Neuro-Fuzzy Networks
Authors: Peyman Shadman Heidari, Mohammad Khorasani
Abstract:
The principal purpose of this article is to present a new method based on the Adaptive Neural Network Fuzzy Inference System (ANFIS) to generate additional artificial earthquake accelerograms from presented data that are compatible with specified response spectra. The proposed method uses the learning abilities of ANFIS to develop knowledge of the inverse mapping from a response spectrum to earthquake records. In addition, the wavelet packet transform is used to decompose specified earthquake records, and ANFISs are then trained to relate the response spectrum of the records to their wavelet packet coefficients. Finally, an illustrative example is presented which uses an ensemble of recorded accelerograms to demonstrate the effectiveness of the proposed method.
Keywords: Adaptive Neural Network Fuzzy Inference System, Wavelet Packet Transform, Response Spectrum.
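As an illustration of the decomposition side of the pipeline, the sketch below performs a level-3 wavelet packet decomposition of a synthetic accelerogram with PyWavelets and reconstructs a time history from modified coefficients. In the paper's scheme the coefficients would instead be supplied by ANFIS models trained on a target response spectrum; the wavelet choice, decomposition level and scaling factor here are arbitrary assumptions.

```python
import numpy as np
import pywt

# Synthetic signal standing in for a recorded accelerogram.
rng = np.random.default_rng(0)
accel = rng.normal(size=1024)

# Level-3 wavelet packet decomposition (8 sub-bands).
wp = pywt.WaveletPacket(data=accel, wavelet='db4', mode='symmetric', maxlevel=3)
coeffs = {node.path: node.data for node in wp.get_level(3, order='natural')}
print(len(coeffs), "sub-bands; 'aaa' holds", len(coeffs['aaa']), "coefficients")

# Write (here: mildly rescaled) coefficients into an empty tree and reconstruct
# a time history; an ANFIS-driven generator would supply these coefficients.
wp_new = pywt.WaveletPacket(data=None, wavelet='db4', mode='symmetric', maxlevel=3)
for path, data in coeffs.items():
    wp_new[path] = data * 1.05
synthetic = wp_new.reconstruct(update=False)
```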
185 Metamodel for Artefacts in Service Engineering Analysis and Design
Authors: Purnomo Yustianto, Robin Doss
Abstract:
As a process of developing a service system, the term ‘service engineering’ evolves in scope and definition. To achieve an integrated understanding of the process, a general framework and an ontology are required. This paper extends a previously built service engineering framework by exploring metamodels for the framework artefacts based on a foundational ontology and a metamodel landscape. The first part of this paper presents a correlation map between the proposed framework with the ontology as a form of evaluation for the conceptual coverage of the framework. The mapping also serves to characterize the artefacts to be produced for each activity in the framework. The second part describes potential metamodels to be used, from the metamodel landscape, as alternative formats of the framework artefacts. The results suggest that the framework sufficiently covers the ontological concepts, both from general service context and software service context. The metamodel exploration enriches the suggested artefact format from the original eighteen formats to thirty metamodel alternatives.
Keywords: Artefact, framework, service, metamodel.
184 Production of Energetic Nanomaterials by Spray Flash Evaporation
Authors: Martin Klaumünzer, Jakob Hübner, Denis Spitzer
Abstract:
In this paper, the latest results on the processing of energetic nanomaterials by means of the Spray Flash Evaporation technique are presented. This technology constitutes a highly effective and continuous way to prepare fascinating materials on the nano- and micro-scale. Within the process, a solution is put under high pressure and sprayed into an evacuated atomization chamber. Subsequent ultrafast evaporation of the solvent leads to an aerosol stream, which is separated by cyclones or filters. No drying gas is required, so the present technique should not be confused with spray drying. The resulting nanothermites, insensitive explosives, propellants and compositions are foreseen to replace toxic (according to REACH) and very sensitive matter in military and civil applications. Diverse examples are given in detail: nano-RDX (n-Cyclotrimethylentrinitramin) and nano-aluminum based systems, mixtures (n-RDX/n-TNT, trinitrotoluene) and even cocrystalline matter like n-CL-20/HMX (Hexanitrohexaazaisowurtzitane/Cyclotetra-methylentetranitramin). These nanomaterials tend to show reduced sensitivity without losing effectiveness and performance. An analytical study for material characterization was performed using Atomic Force Microscopy, X-Ray Diffraction and combined techniques, as well as spectroscopic methods. As a matter of course, sensitivity tests regarding electrostatic discharge, impact and friction are provided.
Keywords: Continuous synthesis, energetic material, nanoscale, nanothermite, nanoexplosive.
183 Accrual Based Scheduling for Cloud in Single and Multi Resource System: Study of Three Techniques
Authors: R. Santhosh, T. Ravichandran
Abstract:
This paper evaluates accrual-based scheduling for the cloud in single- and multi-resource systems. Numerous organizations benefit from cloud computing by hosting their applications. The cloud model provides the needed access to computing with potentially unlimited resources. Scheduling is the mapping of tasks to resources according to a certain optimality principle; it assigns tasks to virtual machines in accordance with adaptable time, in sequence, under transaction logic constraints. A good scheduling algorithm improves CPU utilization, turnaround time and throughput. In this paper, three real-time cloud service scheduling algorithms for single resources and multiple resources are investigated. Experimental results show the performance of the resource matching algorithm to be superior for both single- and multi-resource scheduling when compared to the benefit-first, migration and checkpoint algorithms.
Keywords: Cloud computing, Scheduling, Migration, Checkpoint, Resource Matching.
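A sketch of a deadline-aware matching step in the spirit of the compared "resource matching" strategy: the most urgent task is considered first and placed on the least powerful VM that can still meet its deadline. The data model and the greedy rule are assumptions for illustration, not the algorithms evaluated in the paper.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    deadline: float      # seconds from now
    cpu_demand: float    # e.g. million instructions required

@dataclass
class VM:
    name: str
    cpu_capacity: float  # instructions per second
    free_at: float       # time at which the VM becomes free

def resource_matching(tasks, vms):
    """Greedy sketch: most urgent task first, placed on the slowest VM that can
    still finish it before its deadline (keeping faster VMs for tighter jobs)."""
    schedule = {}
    for task in sorted(tasks, key=lambda t: t.deadline):
        feasible = [vm for vm in vms
                    if vm.free_at + task.cpu_demand / vm.cpu_capacity <= task.deadline]
        if not feasible:
            schedule[task.name] = None                # deadline miss / reject
            continue
        vm = min(feasible, key=lambda v: v.cpu_capacity)
        vm.free_at += task.cpu_demand / vm.cpu_capacity
        schedule[task.name] = vm.name
    return schedule

vms = [VM("small", 500.0, 0.0), VM("big", 2000.0, 0.0)]
tasks = [Task("t1", deadline=4.0, cpu_demand=1500.0),
         Task("t2", deadline=2.0, cpu_demand=1800.0)]
print(resource_matching(tasks, vms))   # {'t2': 'big', 't1': 'small'}
```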
182 A Collaborative Platform for Multilingual Ontology Development
Authors: Ahmed Tawfik, Fausto Giunchiglia, Vincenzo Maltese
Abstract:
Ontologies provide a common understanding of a specific domain of interest that can be communicated between people and used as background knowledge for automated reasoning in a wide range of applications. In this paper, we address the design of multilingual ontologies following well-defined knowledge engineering methodologies with the support of novel collaborative development approaches. In particular, we present a collaborative platform which allows ontologies to be developed incrementally in multiple languages. This is made possible via an appropriate mapping between language independent concepts and one lexicalization per language (or a lexical gap in case such lexicalization does not exist). The collaborative platform has been designed to support the development of the Universal Knowledge Core, a multilingual ontology currently in English, Italian, Chinese, Mongolian, Hindi and Bangladeshi. Its design follows a workflow-based development methodology that models resources as a set of collaborative objects and assigns customizable workflows to build and maintain each collaborative object in a community driven manner, with extensive support of modern web 2.0 social and collaborative features.
Keywords: Knowledge Diversity, Knowledge Representation, Ontology Development.
181 Masquerade and “What Comes Behind Six Is More Than Seven”: Thoughts on Art History and Visual Culture Research Methods
Authors: Osa D Egonwa
Abstract:
In the 21st century, the disciplinary boundaries of past centuries that we often create through mainstream art historical classification, techniques and sources may have been eroded by visual culture, which seems to provide a more inclusive umbrella for the new ways artists go about the creative process and its resultant commodities. Over the past four decades, artists in Africa have resorted to new materials, techniques and themes, which have affected the way we research these artists and their art. Frontline artists such as El Anatsui, Yinka Shonibare and Erasmus Onyishi are demonstrating that any material is suitable for artistic expression. Most of the time, these materials come with their own techniques/effects and visual syntax: a combination of materials compounds techniques, formal aesthetic indexes, halo effects and iconography. This tends to challenge the categories we lean on to view, think and talk about them. It renders our mainstream art historical research methods inadequate, thus suggesting new discursive concepts, terms and theories. This paper proposes Africanist eclectic methods derived from the dual framework of Masquerade Theory and What Comes Behind Six is More Than Seven. The paper shares thoughts and research on art historical methods, terminological re-alignments on classification/source data, presentational format and interpretation arising from the emergent trends in our subject. The outcome provides useful tools to mediate new thoughts and experiences in recent African art and visual culture.
Keywords: Art Historical Methods, Classifications, Concepts, Re-alignment.
180 Use of Caffeine and Human Pharmaceutical Compounds to Identify Sewage Contamination
Authors: Jingming Wu, Junqi Yue, Ruikang Hu, Zhaoguang Yang, Lifeng Zhang
Abstract:
Fecal coliform bacteria are widely used as indicators of sewage contamination in surface water. However, these microbial techniques have some disadvantages, including the time required (18-48 h) and their inability to discriminate between human and animal sources of fecal material. Therefore, it is necessary to seek a more specific indicator of human sanitary waste. In this study, the feasibility of applying caffeine and human pharmaceutical compounds to identify human-source contamination was investigated. The correlation between caffeine and fecal coliform was also explored. Surface water samples were collected from upstream, mid-stream and downstream points along Rochor Canal, as well as from eight locations in Marina Bay. Results indicate that caffeine is a suitable chemical tracer in Singapore because of its easy detection (in the range of 0.30-2.0 ng/mL) compared with the other chemicals monitored. The relatively low concentrations of human pharmaceutical compounds (< 0.07 ng/mL) in the Rochor Canal and Marina Bay water samples make them hard to detect and difficult to use as chemical tracers; however, their presence can help to validate sewage contamination. In addition, a high correlation was found between caffeine concentration and fecal coliform density in the Rochor Canal water samples, demonstrating that caffeine is strongly related to human-source contamination.
Keywords: Caffeine, Human Pharmaceutical Compounds, Chemical Tracer, Sewage Contamination.
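The reported caffeine-coliform relationship reduces to a simple correlation computation once paired measurements are available; the sketch below shows it with invented numbers that merely sit in the caffeine range the abstract reports (these are not the study's data).

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measurements from canal sampling points (illustrative only).
caffeine = np.array([0.35, 0.60, 0.95, 1.30, 1.75, 2.00])       # ng/mL
fecal_coliform = np.array([120, 260, 540, 800, 1500, 1900])     # CFU/100 mL

r, p = pearsonr(caffeine, fecal_coliform)
print(f"Pearson r = {r:.2f} (p = {p:.4f}) between caffeine and coliform density")
```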
179 Event Information Extraction System (EIEE): FSM vs HMM
Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani
Abstract:
Automatic extraction of event information from social text streams (emails, social networking sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date and time. Emails are quite distinct from other social text streams in layout, format and conversation style, and they are the most commonly used communication channel for broadcasting and planning events; therefore we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, Finite State Machines (FSM) and the Hidden Markov Model (HMM), for the extraction of event-related contextual information. An application has been developed that provides a comparison between the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall and F-score. Experiments show that both methods produce high performance and accuracy; however, the HMM was better for title extraction, while the FSM proved better for venue, date and time.
Keywords: Emails, Event Extraction, Event Detection, Finite state machines, Hidden Markov Model.
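To make the extraction task concrete, here is a toy pattern-matching pass over an email body for the date, time and venue components. Real FSM and HMM extractors like those compared in the paper use much richer state models and training data; the regexes and the sample email below are invented.

```python
import re

# Toy patterns standing in for extraction states; real emails need richer grammars.
DATE = re.compile(r'\b(\d{1,2}[/-]\d{1,2}[/-]\d{2,4}|\w+ \d{1,2}(st|nd|rd|th)?,? \d{4})\b')
TIME = re.compile(r'\b\d{1,2}(:\d{2})?\s?(am|pm|AM|PM)\b')
VENUE = re.compile(r'\b(?:at|venue:)\s+([A-Z][A-Za-z ]{2,40})')

def extract_event(email_body):
    """Return whichever of date, time and venue the patterns can find."""
    date = DATE.search(email_body)
    time = TIME.search(email_body)
    venue = VENUE.search(email_body)
    return {
        "date": date.group(0) if date else None,
        "time": time.group(0) if time else None,
        "venue": venue.group(1).strip() if venue else None,
    }

mail = "Hi all, the sprint review will be at Marina Boardroom, 12/08/2011, 3:30 pm. See you there."
print(extract_event(mail))
# {'date': '12/08/2011', 'time': '3:30 pm', 'venue': 'Marina Boardroom'}
```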
178 An Intelligent System Framework for Generating Activity List of a Project Using WBS Mind map and Semantic Network
Authors: H. Iranmanesh, M. Madadi
Abstract:
The Work Breakdown Structure (WBS) is one of the most vital planning processes of project management, since it is considered the foundation of other processes such as scheduling, controlling and assigning responsibilities. In fact, the WBS or activity list is the heart of a project, and the omission of a single task can lead to an irrecoverable result. There are some tools for generating a project WBS. One of the most powerful is mind mapping, which is the basis of this article. A mind map is a method for thinking together and helps a project manager stimulate the minds of project team members to generate the project WBS. Here we try to generate the WBS of a sample building construction project with the aid of a mind map and an artificial intelligence (AI) programming language. Since the mind map structure cannot represent data in a computerized way, we convert it to a semantic network which can be used by the computer, and we then extract the final WBS from the semantic network using the Prolog programming language. This method results in a comprehensive WBS and decreases the probability of omitting project tasks.
Keywords: Expert System, Mind map, Semantic network, Work breakdown structure.
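The paper extracts the WBS from the semantic network with Prolog; the sketch below shows the same traversal idea in Python over a toy "part-of" network, producing a numbered activity list. The project content and the relation name are invented for illustration and are not the paper's knowledge base.

```python
# Toy semantic network: child -> parent via a "part-of" relation.
part_of = {
    "Foundation": "Building project",
    "Structure": "Building project",
    "Excavation": "Foundation",
    "Concrete pouring": "Foundation",
    "Steel frame": "Structure",
}

def children(node):
    return [c for c, p in part_of.items() if p == node]

def wbs(node, prefix="1"):
    """Depth-first walk of the part-of relation, yielding numbered WBS entries."""
    yield prefix, node
    for i, child in enumerate(children(node), start=1):
        yield from wbs(child, f"{prefix}.{i}")

for code, activity in wbs("Building project"):
    print(code, activity)
# 1 Building project
# 1.1 Foundation
# 1.1.1 Excavation
# 1.1.2 Concrete pouring
# 1.2 Structure
# 1.2.1 Steel frame
```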
177 Software Evolution Based Sequence Diagrams Merging
Authors: Zine-Eddine Bouras, Abdelouaheb Talai
Abstract:
The need to merge software artifacts seems inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means of dealing with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. In addition, the earlier changes are introduced into the life cycle, the easier they are for designers to manage. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams, using the concept of dependence analysis, which formally captures all mappings and differences between the elements of sequence diagrams and serves as a key concept for creating a new version of a sequence diagram.
Keywords: System behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing.
176 Application of UAS in Forest Firefighting for Detecting Ignitions and 3D Fuel Volume Estimation
Authors: Artur Krukowski, Emmanouela Vogiatzaki
Abstract:
The article presents results from the AF3 project “Advanced Forest Fire Fighting”, focused on Unmanned Aircraft Systems (UAS)-based 3D surveillance and 3D area mapping using high-resolution photogrammetric methods from multispectral imaging, also taking advantage of the 3D scanning techniques from the SCAN4RECO project. We also present a proprietary embedded sensor system used for the detection of fire ignitions in the forest, based on a near-infrared scanner whose weight and form factor allow it to be easily deployed on standard commercial micro-UAVs, such as the DJI Inspire or Mavic. Results from real-life pilot trials in Greece, Spain, and Israel demonstrated the added value of using UAS for precise and reliable detection of forest fires, as well as for high-resolution 3D aerial modeling for accurate quantification of the human resources and equipment required for firefighting.
Keywords: Forest wildfires, fuel volume estimation, 3D modeling, UAV, surveillance, firefighting, ignition detectors.
175 An Efficient Graph Query Algorithm Based on Important Vertices and Decision Features
Authors: Xiantong Li, Jianzhong Li
Abstract:
Graphs have become increasingly important in modeling complicated structures and schemaless data such as proteins, chemical compounds, and XML documents. Given a graph query, it is desirable to retrieve graphs quickly from a large database via graph-based indices. Different from existing methods, our approach, called VFM (Vertex to Frequent Feature Mapping), makes use of vertices and decision features as the basic indexing features. VFM constructs two mappings between vertices and frequent features to answer graph queries. The VFM approach not only provides an elegant solution to the graph indexing problem, but also demonstrates how database indexing and query processing can benefit from data mining, especially frequent pattern mining. The results show that the proposed method not only avoids enumerating the subgraphs of the query graph, but also effectively reduces the number of subgraph isomorphism tests between the query graph and the graphs in the candidate answer set during the verification stage.
Keywords: Decision Feature, Frequent Feature, Graph Dataset, Graph Query.
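A minimal sketch of feature-based candidate filtering in the spirit of this approach: each database graph is summarised by counts of simple features (vertex labels and labelled edges), and a query can only be contained in graphs whose counts dominate the query's. This toy filter stands in for the paper's vertex-to-frequent-feature mappings; verification by subgraph isomorphism would still follow on the surviving candidates.

```python
from collections import Counter

def summary(graph):
    """graph = (vertex_labels, edges); edges are (label_u, label_v) pairs."""
    vertex_labels, edges = graph
    return Counter(vertex_labels), Counter(tuple(sorted(e)) for e in edges)

def candidates(query, database):
    """Keep only graphs whose feature counts dominate the query's counts."""
    q_vertices, q_edges = summary(query)
    survivors = []
    for gid, graph in database.items():
        g_vertices, g_edges = summary(graph)
        if (all(g_vertices[l] >= n for l, n in q_vertices.items()) and
                all(g_edges[e] >= n for e, n in q_edges.items())):
            survivors.append(gid)     # passes the filter; still needs isomorphism checks
    return survivors

db = {
    "g1": (["C", "C", "O", "N"], [("C", "C"), ("C", "O"), ("C", "N")]),
    "g2": (["C", "O"], [("C", "O")]),
    "g3": (["C", "N"], [("C", "N")]),
}
query = (["C", "O"], [("C", "O")])
print(candidates(query, db))          # ['g1', 'g2']  (g3 lacks an O vertex)
```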