Search results for: Point process.
1366 Categorizing Search Result Records Using Word Sense Disambiguation
Authors: R. Babisaraswathi, N. Shanthi, S. S. Kiruthika
Abstract:
Web search engines retrieve information from web databases and return dynamic web pages. The Semantic Web is an extension of the current web that adds semantic content to web pages. Its main goal is to improve the quality of the current web by making its contents machine-understandable; the milestone of the Semantic Web is therefore to have semantic-level information on the web. Nowadays, people use keyword-based search engines to find the relevant information they need from the web, but many words are polysemous. When such words are used to query a search engine, it returns Search Result Records (SRRs) with different meanings. In this work, SRRs with similar meanings are grouped together based on Word Sense Disambiguation (WSD). In addition, semantic annotation, the process of adding semantic metadata to web resources, is performed to improve the usefulness of the SRRs: the grouped SRRs are annotated, and a summary describing the information in the SRRs is generated. Because automatic semantic annotation remains a significant challenge in the Semantic Web, ontology and knowledge-based representation are used here to annotate the web pages.
Keywords: Ontology, Semantic Web, WordNet, Word Sense Disambiguation.
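As a rough illustration of how SRRs with similar meanings can be grouped, the sketch below applies a simplified Lesk-style gloss-overlap disambiguation. The two-sense inventory and the snippets are hypothetical; a real system would draw glosses from WordNet, as the paper does.

```python
# Sketch: grouping search result records (SRRs) by word sense using a
# simplified Lesk-style gloss-overlap score. The tiny sense inventory
# below is hypothetical; a real system would use WordNet glosses.

SENSES = {
    "bank_finance": "institution that accepts deposits and lends money",
    "bank_river": "sloping land beside a body of water river shore",
}

def disambiguate(snippet, senses=SENSES):
    """Return the sense whose gloss overlaps most with the snippet."""
    words = set(snippet.lower().split())
    return max(senses, key=lambda s: len(words & set(senses[s].split())))

def group_srrs(snippets):
    """Cluster snippets by the sense assigned to the ambiguous keyword."""
    groups = {}
    for s in snippets:
        groups.setdefault(disambiguate(s), []).append(s)
    return groups

groups = group_srrs([
    "the bank raised interest rates on deposits",
    "we walked along the river bank at sunset",
])
```

The two snippets end up in different groups because each shares vocabulary with a different gloss.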
1365 Removal of Heavy Metals from Water in the Presence of Organic Wastes: Fruit Peels
Authors: Berk Kılıç, Derin Dalgıç, Ela Mia Sevilla Levi, Ömer Aydın
Abstract:
In this experiment, our goal was to remove heavy metals from water. The removal of toxic heavy-metal ions (Cu2+, Cr6+ and Fe3+) from aqueous solutions has previously been studied with different kinds of plant peels; this study focuses on banana, peach, orange, and potato peels. The first step of the experiment was to wash the peels with distilled water and then dry them in an oven at 80 °C for 80 h. The peels were then washed with NaOH and dried again at 80 °C for 2 days. Once washed and dried, 0.4 g of peel was weighed and added to a 200 mL sample of 0.1% (by mass) heavy-metal solution. Mixing was done with a magnetic stirrer. A sample of each solution was taken at 15-minute intervals, and the change in absorbance was measured with a UV-Vis spectrophotometer. Among the waste products used, orange peel showed the best results, followed by banana peel as the next most efficient for our purposes. Moreover, the amount of fruit peel, the pH of the initial heavy-metal solution, and the initial heavy-metal concentration were varied to determine the effectiveness of the fruit peels.
Keywords: Absorbance, heavy metal, removal of heavy metals, fruit peels.
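The abstract tracks removal through absorbance change over time. Assuming absorbance is proportional to the remaining metal-ion concentration (the Beer-Lambert regime), removal efficiency can be computed as sketched below; the readings are hypothetical.

```python
def removal_efficiency(a0, at):
    """Percent removal, assuming absorbance is proportional to
    metal-ion concentration (Beer-Lambert regime)."""
    return 100.0 * (a0 - at) / a0

# Hypothetical absorbance readings taken at 15-minute intervals.
readings = [0.80, 0.62, 0.51, 0.44, 0.40]
series = [removal_efficiency(readings[0], a) for a in readings]
```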
1364 Lowering Error Floors by Concatenation of Low-Density Parity-Check and Array Code
Authors: Cinna Soltanpur, Mohammad Ghamari, Behzad Momahed Heravi, Fatemeh Zare
Abstract:
Low-density parity-check (LDPC) codes have been shown to deliver capacity-approaching performance; however, problematic graphical structures (e.g., trapping sets) in the Tanner graph of some LDPC codes can cause high error floors in bit-error-ratio (BER) performance under the conventional sum-product algorithm (SPA). This paper presents a serial concatenation scheme to avoid the trapping sets and to lower the error floors of LDPC codes. The outer code in the proposed concatenation is the LDPC code, and the inner code is a high-rate array code. This approach applies an interactive hybrid process between BCJR decoding for the array code and the SPA for the LDPC code, together with bit-pinning and bit-flipping techniques. The Margulis code of size (2640, 1320) has been used for the simulation, and it has been shown that the proposed concatenation and decoding scheme can considerably improve the error-floor performance with minimal rate loss.
Keywords: Concatenated coding, low-density parity-check codes, array code, error floors.
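Bit-flipping, one of the techniques named above, can be illustrated with a minimal hard-decision decoder. The small parity-check matrix below is a toy example, not the Margulis LDPC code or the array code used in the paper.

```python
# Minimal sketch of hard-decision bit-flipping decoding for a small
# parity-check matrix H (rows = checks, columns = bits). This toy H is
# illustrative only.

H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def syndrome(H, word):
    """Evaluate every parity check (mod 2) on the received word."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, word, max_iters=10):
    """Repeatedly flip the bit involved in the most failed checks."""
    word = list(word)
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            return word  # all checks satisfied
        fails = [sum(s[i] for i in range(len(H)) if H[i][j])
                 for j in range(len(word))]
        word[fails.index(max(fails))] ^= 1
    return word
```

For example, [1, 1, 0, 0, 1, 1] satisfies all three checks of this toy H, and flipping its first bit produces a word the decoder corrects in one iteration.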
1363 Shadow Imaging Study of Z-Pinch Dynamic Hohlraum
Authors: Chen Faxin, Feng Jinghua, Yang Jianlun, Li Linbo, Zhou Lin
Abstract:
In order to obtain dynamic evolution images of a foam-padded tungsten array, and to study the form of the interaction between the tungsten plasma and the foam column, a four-frame shadow imaging system using an ultraviolet probe laser (266 nm) has been designed on a 1 MA pulsed-power device. The time resolution of the system is 2.5 ns, and its static spatial resolution is better than 70 μm. The radial shadowgraphy images reveal the whole process, from the melting and expansion of the solid wires to the interaction of the precursor plasma with the foam, and from the pinch to the rebound inflation. The images show the tungsten plasma and foam interacting continuously in a "raining" form over a period of about 50 ns; no plasma shell structure was found during the whole pinch. Quantitative analysis indicates that the minimum pinch speed of the foam column is 1.0×10^6 cm/s, the maximum pinch speed is 6.0×10^6 cm/s, and the axial stagnation diameter is approximately 1 mm.
Keywords: Dynamic hohlraum, Shadowgraphy image, Foam evolution.
1362 Inter-Organizational Knowledge Transfer Through Malaysia E-government IT Outsourcing: A Theoretical Review
Authors: Nor Aziati Abdul Hamid, Juhana Salim
Abstract:
The main objective of this paper is to contribute to the existing knowledge transfer and IT outsourcing literature, specifically in the context of Malaysia, by reviewing the current practices of e-government IT outsourcing in Malaysia, including the issues and challenges faced by public agencies in transferring knowledge during the engagement. The paper discusses various factors and different theoretical models of knowledge transfer, from the traditional models to recent models suggested by scholars, and attempts to align organizational knowledge with the knowledge-based view (KBV) and organizational learning (OL) lenses. This review could help shape the direction of future theoretical and empirical studies on inter-firm knowledge transfer, specifically on how the KBV and OL perspectives can play a significant role in explaining the complex relationship between client and vendor in inter-firm knowledge transfer, and on the role of organizational management information systems and Transactive Memory Systems (TMS) in facilitating the knowledge transfer process. A conclusion is drawn and further research is suggested.
Keywords: E-government, IT outsourcing, knowledge management, knowledge transfer.
1361 Physical and Mechanical Phenomena Associated with Rock Failure in Brazilian Disc Specimens
Authors: Hamid Reza Nejati, Amin Nazerigivi, Ahmad Reza Sayadi
Abstract:
The failure mechanism of rocks is one of the fundamental aspects of studying the stability of rock engineering structures. Rock is a material that contains flaws, initial damage, micro-cracks, etc. Failure of a rock structure is largely due to tensile stress and is influenced by various parameters. In the present study, the effects of brittleness and loading rate on the physical and mechanical phenomena produced in rock during loading sequences are considered. For this purpose, the Acoustic Emission (AE) technique is used to monitor the fracturing process of three rock types (onyx marble, sandstone and soft limestone) with different brittleness, and of sandstone samples under different loading rates. The results of the experimental tests revealed that brittleness and loading rate have a significant effect on the mode and number of induced fractures in rocks: an increase in rock brittleness increases the frequency of induced cracks, and the number of tensile fractures decreases when the loading rate increases.
Keywords: Brittleness, loading rate, acoustic emission, tensile fracture, shear fracture.
1360 Student Feedback and Its Impact on Fostering the Quality of Teaching at the Academia
Authors: S. Vanker, A. Aaver, A. Roio, L. Nuut
Abstract:
To identify effective and less effective or ineffective approaches to course instruction, we hold the opinion that faculty members need regular feedback from their students in order to be aware of how well or poorly their teaching styles have worked when instructing courses. Undergraduate students' motivation can be sustained by continually improving the quality of teaching and by properly sequencing the academic courses, both in the curricula and in the timetables. At the Estonian Aviation Academy, four different forms of feedback are used: lecture monitoring, questionnaires for all students, subject monitoring through the study information system, and direct feedback received by the lecturer. Questionnaires for all students are arranged once per study year, separately for first-year and senior students. The results are discussed in the academic departments together with student representatives, analyzed with the teaching staff and, if needed, improvements are suggested. In addition, a monitoring system is planned in which a lecturer acts in both roles, as an observer and as the lecturer. This will foster a better exchange of experience and thereby help to make the whole study process more interesting.
Keywords: Student support, learner motivation, feedback, undergraduate education.
1359 Dynamic Metrics for Polymorphism in Object Oriented Systems
Authors: Parvinder Singh Sandhu, Gurdev Singh
Abstract:
Measurement is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules. Software metrics are instruments for measuring all aspects of a software product. These metrics are used throughout a software project to assist in estimation, quality control, productivity assessment, and project control. Object-oriented software metrics focus on measurements that are applied to classes and other characteristics; such measurements convey to the software engineer how the software behaves and how changes can be made that will reduce complexity and improve the continuing capability of the software. Object-oriented software metrics can be classified into two types: static and dynamic. Static metrics are concerned with measuring aspects of the software by static analysis, while dynamic metrics are concerned with measuring aspects of the software at run time. Most earlier work focused on static metrics; some work has also been done on the dynamic side of software measurement, but this area still demands more research. In this paper, we give a set of dynamic metrics specifically for polymorphism in object-oriented systems.
Keywords: Metrics, software, quality, object-oriented systems, polymorphism.
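One way to make such a metric concrete is to count the distinct receiver classes actually observed at a call site at run time. The sketch below is illustrative only; it is not claimed to be the metric set proposed in the paper.

```python
# Sketch of one possible dynamic polymorphism metric: the number of
# distinct receiver classes observed at a call site at run time.
# The classes, call site name, and metric are all illustrative.

from collections import defaultdict

observed = defaultdict(set)

def record_call(site, receiver):
    """Log the concrete class that handled a call at a given site."""
    observed[site].add(type(receiver).__name__)
    return receiver.speak()

class Dog:
    def speak(self): return "woof"

class Cat:
    def speak(self): return "meow"

for animal in (Dog(), Cat(), Dog()):
    record_call("main.speak", animal)

# dynamic polymorphism degree of this call site:
degree = len(observed["main.speak"])
```

A static metric would report the declared hierarchy; the dynamic count reflects which subclasses the call site actually dispatched to during execution.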
1358 Ensembling Adaptively Constructed Polynomial Regression Models
Authors: Gints Jekabsons
Abstract:
The subset-selection approach to polynomial regression model building assumes that the chosen fixed full set of predefined basis functions contains a subset that can describe the target relation sufficiently well. However, in most cases the necessary set of basis functions is not known and needs to be guessed, a potentially non-trivial (and long) trial-and-error process. In our research we consider a potentially more efficient approach, Adaptive Basis Function Construction (ABFC): it lets the model-building method itself construct the basis functions necessary for creating a model of arbitrary complexity with adequate predictive performance. However, two issues to some extent plague both subset-selection and ABFC methods, especially when working with relatively small data samples: selection bias and selection instability. We try to correct these issues by model post-evaluation using cross-validation and by model ensembling. To evaluate the proposed method, we empirically compare it to ABFC methods without ensembling, to a widely used subset-selection method, and to some other well-known regression modeling methods, using publicly available data sets.
Keywords: Basis function construction, heuristic search, model ensembles, polynomial regression.
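The cross-validation plus ensembling idea can be sketched as follows, with plain polynomial degrees standing in for adaptively constructed basis functions; the data and the candidate degree list are synthetic, not from the paper.

```python
# Sketch: ensembling polynomial models selected by cross-validation.
# Degrees and data are illustrative; ABFC builds basis functions
# adaptively rather than scanning a fixed degree list.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 60)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.05, x.size)

def cv_error(x, y, degree, folds=5):
    """Mean squared error of a degree-d fit under k-fold CV."""
    idx = np.arange(x.size)
    errs = []
    for f in range(folds):
        test = idx % folds == f
        coef = np.polyfit(x[~test], y[~test], degree)
        errs.append(np.mean((np.polyval(coef, x[test]) - y[test]) ** 2))
    return np.mean(errs)

# keep the few best degrees and average their predictions (the ensemble)
degrees = sorted(range(1, 8), key=lambda d: cv_error(x, y, d))[:3]
models = [np.polyfit(x, y, d) for d in degrees]

def ensemble(t):
    return np.mean([np.polyval(c, t) for c in models])
```

Averaging several near-best models rather than committing to the single CV winner is one simple way to damp selection instability.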
1357 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective
Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou
Abstract:
The analysis of geographic inequality relies heavily on location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and to link to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, and only the overall status at the selected level of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the analyzed results, a problem well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan, based on spatial partition principles that seek homogeneity in population and household counts. Compared to the outcomes for traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and township units, respectively, on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) at the township level ranges from 571 to 1,757 per 100,000 persons, whereas the 2nd dissemination area level (TGSC) shows greater variation, ranging from 0 to 2,222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality, and the results can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment).
The management and analysis of the statistical data referenced to the TGSC in this research is strongly supported by Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of the measures, and the geo-visualization of the statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic-distribution perspective to further examine the hidden spatial characteristics of the mortality data and to justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways that help avoid wrong decisions.
Keywords: Mortality map, spatial patterns, statistical area, variation.
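The ASDR figures above come from direct age standardization, which can be sketched as below; the age bands, counts and standard-population weights are made up for illustration, not taken from the Taitung data.

```python
# Sketch of direct age standardization (ASDR per 100,000). The age
# bands, death counts, populations, and standard-population weights
# below are hypothetical placeholders.

def asdr(deaths, population, std_weights):
    """Directly standardized death rate per 100,000 persons."""
    assert abs(sum(std_weights) - 1.0) < 1e-9
    rate = sum(w * d / p for d, p, w in zip(deaths, population, std_weights))
    return 100_000 * rate

deaths      = [2,    5,    40]     # deaths per age band
population  = [4000, 3000, 2000]   # population per age band
std_weights = [0.4,  0.35, 0.25]   # standard population shares
```

Weighting each band's crude rate by a fixed standard population is what makes rates comparable across areas with different age structures.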
1356 Content Based Image Retrieval of Brain MR Images across Different Classes
Authors: Abraham Varghese, Kannan Balakrishnan, Reji R. Varghese, Joseph S. Paul
Abstract:
Magnetic resonance imaging plays a vital role in the decision-diagnosis process for brain MR images. For an accurate diagnosis of brain-related problems, experts usually compare both T1- and T2-weighted images, as the information presented in these two modalities is complementary. In this paper, a rotation- and translation-invariant form of the Local Binary Pattern (LBP) with additional gray-scale information is used to retrieve slices of T1-weighted images similar to given T2-weighted images, or vice versa. Incorporating additional gray-scale information into the LBP extracts more local texture information. The accuracy of retrieval can be improved by extracting moment features of the LBP and reweighting the features based on user feedback. Retrieval is done both in a single-subject scenario, where similar images of a particular subject at a particular level are retrieved, and in a multiple-subject scenario, where relevant images at a particular level across subjects are retrieved.
Keywords: Local Binary Pattern (LBP), Modified Local Binary Pattern (MOD-LBP), T1- and T2-weighted images, moment features.
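The basic 8-neighbour LBP code, and the rotation-invariant form obtained by minimizing over circular bit rotations, can be sketched as follows. This is a plain illustration, not the MOD-LBP variant of the paper.

```python
# Sketch: the basic 8-neighbour local binary pattern (LBP) code for a
# pixel, plus the rotation-invariant form obtained by taking the
# minimum over all circular bit rotations.

def lbp_code(neighbours, center):
    """Threshold the 8 neighbours against the centre pixel value."""
    bits = [1 if n >= center else 0 for n in neighbours]
    return sum(b << i for i, b in enumerate(bits))

def rotation_invariant(code, bits=8):
    """Minimum value over all circular rotations of the bit pattern."""
    best = code
    for _ in range(bits - 1):
        code = ((code >> 1) | ((code & 1) << (bits - 1))) & ((1 << bits) - 1)
        best = min(best, code)
    return best
```

Patterns that differ only by an in-plane rotation of the neighbourhood, such as 0b00000011 and 0b00000110, map to the same invariant code.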
1355 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping
Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa
Abstract:
The artificial neural network is one of the interesting techniques that have been used advantageously for modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to the information processing of a one-dimensional task. We aim to introduce a new method based on a new coding approach for generating the input-output mapping, which increases the number of neuron units in the last layer. To show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the number of neuron units in the last layer makes it possible to find the optimal network parameters that fit the mapping data. Moreover, it decreases the training time during the computation process, which avoids the need for computers with large amounts of memory.
Keywords: Neural network computing, information processing, input-output mapping, training time, computers with high memory.
1354 Production Line Layout Planning Based on Complexity Measurement
Authors: Guoliang Fan, Aiping Li, Nan Xie, Liyun Xu, Xuemei Liu
Abstract:
Mass-customization production increases the difficulty of production line layout planning. The material distribution process for a variety of parts is very complex, which greatly increases the cost of material handling and logistics. In response to this problem, this paper presents an approach to production line layout planning based on complexity measurement. First, by analyzing the factors that influence equipment layout, a complexity model of the production line is established using information entropy theory. Then, the cost of part logistics is derived, considering the variety of parts. Furthermore, an optimization function with two objectives, the lowest cost and the least configuration complexity, is built. Finally, the validity of the function is verified in a case study. The results show that the proposed approach can find the layout scheme with the lowest logistics cost and the least complexity. Optimized production line layout planning can effectively improve production efficiency and equipment utilization at the lowest cost and complexity.
Keywords: Production line, layout planning, complexity measurement, optimization, mass customization.
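An information-entropy complexity measure of the kind mentioned above can be sketched as below, here computed over the part-variant mix each station must handle; the exact entropy model in the paper may differ.

```python
# Sketch: an information-entropy complexity measure for a production
# line, computed from the mix probabilities of part variants handled
# at each station. A common entropy formulation, used illustratively.

from math import log2

def station_entropy(variant_shares):
    """Shannon entropy (bits) of the part-variant mix at one station."""
    return -sum(p * log2(p) for p in variant_shares if p > 0)

def line_complexity(stations):
    """Total complexity: sum of station entropies along the line."""
    return sum(station_entropy(s) for s in stations)

# two stations: one handles 4 equally likely variants, one handles 1
complexity = line_complexity([[0.25, 0.25, 0.25, 0.25], [1.0]])
```

A station facing a single variant contributes zero entropy, so the measure grows only where variety actually complicates material handling.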
1353 A Text Clustering System based on k-means Type Subspace Clustering and Ontology
Authors: Liping Jing, Michael K. Ng, Xinhua Yang, Joshua Zhexue Huang
Abstract:
This paper presents a text clustering system developed on the basis of a k-means type subspace clustering algorithm to cluster large, high-dimensional and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster, so that the important words of a cluster can be identified by the weight values. For understanding and interpreting the clustering results, a few keywords that best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words: the candidate words are first selected according to their weights, as calculated by our new algorithm; then the candidates are fed to WordNet to identify the set of noun words and to consolidate synonyms and hyponyms. Experimental results have shown that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to k-means type algorithms, e.g., Bisecting-KMeans. Furthermore, the word extraction method is effective in selecting the words that represent the topics of the clusters.
Keywords: Subspace clustering, text mining, feature weighting, cluster interpretation, ontology.
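The weight-based selection of representative keywords can be sketched with a simplified within-cluster frequency weighting. This is a stand-in for the algorithm's automatic weight calculation inside k-means, and the WordNet consolidation step is omitted.

```python
# Sketch: picking representative keywords for one cluster by weighting
# terms by within-cluster frequency (a simplified stand-in for the
# paper's weight calculation; documents and stop list are toy data).

from collections import Counter

def keyword_weights(docs):
    """Normalized term frequencies over all documents in one cluster."""
    counts = Counter(w for d in docs for w in d.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def top_keywords(docs, k=2, stop={"the", "a", "of", "is", "in", "as"}):
    """Rank non-stopword terms by weight; return the k heaviest."""
    ws = keyword_weights(docs)
    ranked = sorted((w for w in ws if w not in stop),
                    key=ws.get, reverse=True)
    return ranked[:k]

cluster = ["the market fell as stocks dropped",
           "stocks rallied in the bond market"]
```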
1352 Concrete Recycling in Egypt for Construction Applications: A Technical and Financial Feasibility Model
Authors: Omar Farahat Hassanein, A. Samer Ezeldin
Abstract:
The construction industry is a very dynamic field. Every day, new technologies and methods are developed to speed up the process and increase its efficiency; a project that uses fewer resources is therefore more efficient.
This paper examines the recycling of concrete construction and demolition (C&D) waste for reuse as aggregate in on-site applications for construction projects in Egypt, and possibly in the Middle East. The study focuses on a stationary plant setting; the machinery set-up used in the plant is analyzed technically and financially.
The findings are gathered and grouped to obtain a comprehensive cost-benefit financial model that demonstrates the feasibility of establishing and operating a concrete recycling plant. Furthermore, a detailed business plan, including the timeline and hierarchy, is proposed.
Keywords: Construction wastes, recycling, sustainability, financial model, concrete recycling, concrete life cycle.
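A cost-benefit model of this kind typically reduces to a discounted cash-flow check. The sketch below uses hypothetical figures (capex, yearly savings, discount rate), not values from the study.

```python
# Sketch: a minimal net-present-value (NPV) feasibility check for a
# recycling plant. All figures are hypothetical placeholders.

def npv(rate, cashflows):
    """Discount a series of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

capex = -1_000_000            # year-0 investment in the stationary plant
yearly = [300_000] * 5        # assumed net savings from recycled aggregate
project_npv = npv(0.10, [capex] + yearly)
feasible = project_npv > 0
```

A positive NPV at the chosen discount rate is the usual go/no-go signal; sensitivity to the rate and to the yearly savings would be the next step in a full model.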
1351 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-Time
Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl
Abstract:
In recent years, SQL injection attacks have been identified as prevalent against web applications. They affect network security and user data, leading to considerable losses of money and data every year. This paper presents the use of machine-learning classification algorithms to classify login inputs as "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of deciding whether an operation is an attack or a valid operation. A web app has been developed for auto-generated data replication to provide a twin of the targeted data structure. A shield against SQLi attacks (WebAppShield) has been developed that verifies all users and allows only operations that the machine-learning module predicts as "Non-SQLi" to reach the database, keeping attackers out. A special login form has been developed with a special instance of data validation; this verification process secures the web application from its early stages. The system has been tested and validated, and up to 99% of SQLi attacks have been prevented.
Keywords: SQL injection, attacks, web application, accuracy, database, WebAppShield.
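A minimal stand-in for the classification step is a token-level Naive Bayes filter over login inputs. The training strings below are toy examples; the paper's WebAppShield pipeline and features are more elaborate.

```python
# Sketch: a tiny token-level Naive Bayes filter labelling login inputs
# "SQLi" vs "Non-SQLi". Training data and tokenizer are toy examples.

from collections import Counter
from math import log

TRAIN = [
    ("' or 1=1 --", "SQLi"),
    ("admin'; drop table users; --", "SQLi"),
    ("union select password from users", "SQLi"),
    ("alice", "Non-SQLi"),
    ("bob smith", "Non-SQLi"),
    ("carol99", "Non-SQLi"),
]

def tokenize(s):
    return s.lower().replace("'", " ' ").split()

counts = {"SQLi": Counter(), "Non-SQLi": Counter()}
for text, label in TRAIN:
    counts[label].update(tokenize(text))

vocab = set(counts["SQLi"]) | set(counts["Non-SQLi"])

def classify(text):
    """Laplace-smoothed log-likelihood per class (equal priors)."""
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = sum(
            log((c[t] + 1) / (total + len(vocab))) for t in tokenize(text))
    return max(scores, key=scores.get)
```

Inputs containing injection markers such as `'` or `1=1` score higher under the "SQLi" class, while plain usernames fall back to "Non-SQLi".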
1350 Flocculation on the Treatment of Olive Oil Mill Wastewater: Pretreatment
Authors: G. Hodaifa, J. A. Páez, C. Agabo, E. Ramos, J. C. Gutiérrez, A. Rosal
Abstract:
The continuous two-phase decanter process used for olive oil production is currently the most widespread internationally. The wastewaters generated by this industry (OMW) are a real environmental problem because of their high organic load. Among the treatments proposed for these wastewaters, advanced oxidation technologies (Fenton, ozone, photo-Fenton, etc.) are the most favourable, but their direct application is somewhat expensive. Therefore, a preliminary stage based on a flocculation-sedimentation operation is of high importance. In this research, five commercial flocculants (three cationic and two anionic) have been used to achieve the separation of phases (clarified liquid and sludge). For each flocculant, different concentrations (0-1000 mg/L) have been studied. In these experiments, the sludge volume formed and the final water quality were determined. The final removal percentages of total phenols (11.3-25.1%), COD (5.6-20.4%), total carbon (2.3-26.5%), total organic carbon (1.5-23.8%), total nitrogen (1.45-24.8%), and turbidity (27.9-61.4%) were determined, as was the variation in the electric conductivity reduction percentage (1-8%). Finally, the flocculants with the highest removal percentages were identified (QG2001 and Flocudex CS49).
Keywords: Flocculants, flocculation, olive oil mill wastewater, water quality.
1349 Some Issues on Integrating Telepresence Technology into Industrial Robotic Assembly
Authors: Gunther Reinhart, Marwan Radi
Abstract:
Since the 1940s, many promising telepresence research results have been obtained; however, telepresence technology has still not reached industrial usage. Human intelligence is necessary for the successful execution of most manual assembly tasks, but human ability is hindered in some cases, such as the assembly of heavy parts in small and medium lots or prototypes; in such cases of manual assembly, the help of industrial robots is mandatory. Telepresence technology can be considered a solution for performing such assembly tasks, where human intelligence and the haptic sense are needed to identify and minimize errors during the assembly process, and a robot is needed to carry the heavy parts. In this paper, preliminary steps toward integrating telepresence technology into industrial robot systems are introduced. The system described here combines the human haptic sense and the industrial robot's capability to perform a manual assembly task remotely using a force-feedback joystick. A mapping between the joystick's degrees of freedom (DOF) and the robot's is introduced. Simulation and experimental results are shown, and future work is discussed.
Keywords: Assembly, force feedback, industrial robot, teleassembly, telepresence.
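The joystick-to-robot DOF mapping can be sketched as a per-axis linear scaling with velocity clamping; the gains and limits below are illustrative, not values from the paper.

```python
# Sketch: linear mapping from joystick axis displacement to robot
# Cartesian velocity commands, clamped to a velocity limit. Gains and
# the limit are hypothetical placeholders.

def map_dof(joystick, gains, limit):
    """Scale each joystick axis and clamp to the robot's velocity limit."""
    return [max(-limit, min(limit, g * j)) for j, g in zip(joystick, gains)]

# joystick displacement per axis in [-1, 1]; output in m/s
cmd = map_dof([0.5, -1.0, 0.2], gains=[0.1, 0.1, 0.05], limit=0.08)
```

Clamping the commanded velocity is a simple safety layer; a real teleassembly system would also filter the signal and feed contact forces back to the joystick.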
1348 Preparation and Investigation of Photocatalytic Properties of ZnO Nanocrystals: Effect of Operational Parameters and Kinetic Study
Authors: N. Daneshvar, S. Aber, M. S. Seyed Dorraji, A. R. Khataee, M. H. Rasoulifard
Abstract:
ZnO nanocrystals with a mean diameter of 14 nm have been prepared by a precipitation method and examined as a photocatalyst for the UV-induced degradation of the insecticide diazinon, as a representative organic pollutant, in aqueous solution. The effects of various parameters, such as illumination time, amount of photocatalyst, initial pH and initial insecticide concentration, on the photocatalytic degradation of diazinon were investigated to find the desired conditions; these conditions were also tested for the treatment of real water containing the insecticide. The photodegradation efficiency of diazinon was compared between commercial ZnO and the prepared ZnO nanocrystals: the results indicated that the UV/ZnO process with the prepared nanocrystalline ZnO offered better electrical energy efficiency and quantum yield than commercial ZnO. On the basis of the Langmuir-Hinshelwood mechanism, the present study established a pseudo first-order kinetic model with a surface reaction rate constant of 0.209 mg L-1 min-1 and an adsorption equilibrium constant of 0.124 L mg-1.
Keywords: Zinc oxide nanopowder, electricity consumption, quantum yield, nanoparticles, photodegradation, kinetic model, insecticide.
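The pseudo first-order model can be illustrated by recovering an apparent rate constant from concentration-time data via ln(C0/C) = k_app·t; the data below are synthetic, generated with k_app = 0.05 min^-1, not measurements from the paper.

```python
# Sketch: fitting an apparent pseudo first-order rate constant,
# ln(C0/C) = k_app * t, by least squares through the origin. The data
# are synthetic (generated with k_app = 0.05 /min), not the paper's.

from math import exp, log

k_true = 0.05
times = [0, 10, 20, 30, 40]                    # minutes
conc = [20 * exp(-k_true * t) for t in times]  # mg/L, C0 = 20

y = [log(conc[0] / c) for c in conc]           # ln(C0/C)
k_app = sum(t * v for t, v in zip(times, y)) / sum(t * t for t in times)
```

With real measurements, the linearity of ln(C0/C) versus t is itself the check that the Langmuir-Hinshelwood kinetics have collapsed to the pseudo first-order regime.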
1347 Deterministic Random Number Generator Algorithm for Cryptosystem Keys
Authors: Adi A. Maaita, Hamza A. A. Al_Sewadi
Abstract:
One of the crucial parameters of digital cryptographic systems is the selection of keys and their distribution. The randomness of the keys has a strong impact on the system's security strength, making them difficult to predict, guess, reproduce, or discover by a cryptanalyst. Therefore, adequate key randomness generation is still sought for the benefit of stronger cryptosystems. This paper suggests an algorithm designed to generate and test pseudorandom number sequences intended for cryptographic applications. The algorithm is based on mathematically manipulating information publicly agreed upon between sender and receiver over a public channel. This information is used as a seed for performing mathematical functions that generate a sequence of pseudorandom numbers to be used for encryption/decryption purposes. The manipulation involves permutations and substitutions that fulfill Shannon's principle of "confusion and diffusion". ASCII code characters were utilized in the generation process instead of bit strings, which adds flexibility in testing different seed values. Finally, the obtained results indicate that it would be soundly difficult for attackers to guess the keys.
Keywords: Cryptosystems, information security agreement, key distribution, random numbers.
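A deterministic expansion of publicly agreed seed material can be sketched with iterated hashing in counter mode. This is a simplified stand-in for the paper's permutation/substitution scheme over ASCII characters, shown only to illustrate that both parties derive identical key material from the same seed.

```python
# Sketch: a deterministic keystream from a shared seed via SHA-256 in
# counter mode. A simplified illustration, not the paper's algorithm.

import hashlib

def keystream(seed: str, n_bytes: int) -> bytes:
    """Expand a shared seed into n_bytes of pseudorandom output."""
    out = bytearray()
    counter = 0
    while len(out) < n_bytes:
        block = hashlib.sha256(f"{seed}:{counter}".encode()).digest()
        out.extend(block)
        counter += 1
    return bytes(out[:n_bytes])

# sender and receiver derive identical key material from the agreed seed
k1 = keystream("agreed-public-info", 32)
k2 = keystream("agreed-public-info", 32)
```

Determinism gives both parties the same keys without transmitting them; the security then rests entirely on the unpredictability of the seed material.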
1346 Modified Fuzzy ARTMAP and Supervised Fuzzy ART: Comparative Study with Multispectral Classification
Authors: F.Alilat, S.Loumi, H.Merrad, B.Sansal
Abstract:
In this article, a modification of the fuzzy ART network algorithm is carried out with the aim of making it supervised. It consists of searching for the comparison, training and vigilance parameters that give the minimum quadratic distances between the output of the training base and those obtained by the network. The same process is applied to determine the parameters of the fuzzy ARTMAP giving the most powerful network. The modification consists in having the fuzzy ARTMAP learn a base of examples not just once, as is usual, but as many times as its architecture keeps evolving or the objective error is not reached. In this way, we need not worry about the values to impose on the eight (8) parameters of the network. To evaluate each of these modified networks, a comparison of their performances is carried out. As an application, we carried out a classification of an image of the Bay of Algiers taken by SPOT XS. As evaluation criteria we use the training duration, the mean square error (MSE) at the control step, and the rate of correct classification per class. The results of this study, presented as curves, tables and images, show that the modified fuzzy ARTMAP offers the best quality/computing-time compromise.
Keywords: Neural networks, fuzzy ART, fuzzy ARTMAP, remote sensing, multispectral classification.
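The retraining loop described in the abstract, in which the example base is presented repeatedly until the objective error is met and the architecture stops evolving, could be sketched as follows (a toy stand-in: `fit_epoch`, `predict` and `n_categories` are assumed interface names, and `TinyNet` is a one-weight dummy, not a fuzzy ARTMAP):

```python
class TinyNet:
    """Toy stand-in for the assumed network interface: a single scalar
    weight fitted to map x -> w*x with an LMS-style update (illustrative
    only; a real fuzzy ARTMAP grows categories as it learns)."""
    def __init__(self):
        self.w = 0.0
        self.n_categories = 1
    def fit_epoch(self, xs, ts):
        for x, t in zip(xs, ts):
            self.w += 0.5 * (t - self.w * x) * x
    def predict(self, x):
        return self.w * x

def train_until_stable(net, xs, ts, max_epochs=200, tol=1e-6):
    """Re-present the training base until the objective MSE is reached
    and the architecture (category count) has stopped evolving."""
    prev = -1
    for _ in range(max_epochs):
        net.fit_epoch(xs, ts)
        mse = sum((net.predict(x) - t) ** 2 for x, t in zip(xs, ts)) / len(xs)
        if mse <= tol and net.n_categories == prev:
            break
        prev = net.n_categories
    return net, mse
```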
1345 An Active Mixer with Vertical Flow Placement via a Series of Inlets for Micromixing
Authors: Pil Woo Heo, In Sub Park
Abstract:
Flows in a microchannel are laminar, which means that mixing depends only on inter-diffusion. A micromixer therefore plays an important role in obtaining fast diagnosis results in the fields of μ-TAS (micro total analysis system), Bio-MEMS and LOC (lab-on-a-chip).
In this paper, we propose a new active mixer with vertical flow placement via a series of inlets for micromixing. It has two inlets on the same axis, one located upstream of the other. The sample input through the first inlet flows into the down position, while the sample from the second inlet flows into the up position. In the experiment, the samples were thus placed vertically, in up-down positions, in a micro chamber. A PZT element was attached below the chamber, and ultrasonic waves were radiated in the down-to-up direction towards the samples in order to accelerate mixing. The mixing process was measured by the change of color in the micro chamber using phenolphthalein and NaOH. The results showed that the samples in the micro chamber were efficiently mixed and that the new active mixer was superior to horizontal-type active mixers in terms of grey levels and standard deviation.
Keywords: Active mixer, vertical flow placement, microchannel, bio-MEMS, LOC.
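The grey-level evaluation mentioned above is commonly quantified as the relative standard deviation of pixel intensities, lower values indicating better mixing (a sketch of a standard coefficient-of-variation measure; the paper's exact metric is not given beyond grey levels and standard deviation):

```python
def mixing_index(grey_levels):
    """Coefficient-of-variation style mixing measure over pixel grey
    levels: 0.0 means perfectly uniform (fully mixed); larger values
    mean stronger segregation of the two samples."""
    n = len(grey_levels)
    mean = sum(grey_levels) / n
    var = sum((g - mean) ** 2 for g in grey_levels) / n
    return var ** 0.5 / mean if mean else float("inf")
```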
1344 Natural Preservatives: An Alternative for Chemical Preservatives Used in Foods
Authors: Zerrin Erginkaya, Gözde Konuray
Abstract:
Microbial degradation of foods is defined as a decrease of food safety due to microorganism activity. Organic acids, sulfur dioxide, sulfites, nitrate, nitrite, dimethyl dicarbonate and several preservative gases have been used as chemical preservatives in foods, alongside natural preservatives which are indigenous to foods. Usage of herbal preservatives such as blueberry, dried grape, prune, garlic, mustard and spices has been found to inhibit several microorganisms. Moreover, preservatives of animal origin such as whey, honey, lysozyme of duck and chicken egg, and chitosan have been shown to have an antimicrobial effect. Besides the antimicrobials indigenous to foods, antimicrobial agents produced by microorganisms can also be used as natural preservatives. The antimicrobial performance of a preservative depends on its antimicrobial spectrum, its chemical and physical features, its concentration and mode of action, the components of the food, the process conditions, and the pH and storage temperature. In this review, studies on antimicrobial components indigenous to foods (herbal and animal origin antimicrobial agents), on antimicrobial materials synthesized by microorganisms, and on their usage as antimicrobial agents to preserve foods are discussed.
Keywords: Animal origin preservatives, antimicrobial, chemical preservatives, herbal preservatives.
1343 Optimum Conditions for Effective Decomposition of Toluene as VOC Gas by Pilot-Scale Regenerative Thermal Oxidizer
Authors: S. Iijima, K. Nakayama, D. Kuchar, M. Kubota, H. Matsuda
Abstract:
The Regenerative Thermal Oxidizer (RTO) is one of the best solutions for the removal of Volatile Organic Compounds (VOC) from industrial processes. In the RTO, VOC in the raw gas are usually decomposed at 950-1300 K, and the combustion heat of the VOC is recovered by regenerative heat exchangers charged with ceramic honeycombs. Optimizing the treatment of VOC reduces the fuel added for VOC decomposition, and thus minimizes CO2 emission and operating cost as well. In the present work, the thermal efficiency of the RTO was investigated experimentally in a pilot-scale RTO unit, using toluene as a typical representative of VOC. It was recognized that radiative heat transfer was dominant in the preheating of the raw gas when the gas flow rate was relatively low. Further, the minimum heat-exchanger volume needed to achieve self combustion of toluene, without additional heating of the RTO by fuel combustion, was found to depend on both the flow rate of the raw gas and the concentration of toluene. The thermal efficiency, calculated from the fuel consumption and the decomposed toluene ratio, reached a maximum value of 0.95 at a raw gas mass flow rate of 1810 kg·h⁻¹ and a honeycomb height of 1.5 m.
Keywords: Regenerative Heat Exchange, Self Combustion, Toluene, Volatile Organic Compounds.
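For orientation, a textbook temperature-based effectiveness of the regenerative beds can be computed as below (a sketch only; the study's own efficiency definition combines fuel consumption and the decomposed toluene ratio, and the temperatures in the example are hypothetical):

```python
def regenerator_effectiveness(t_raw_in, t_preheated, t_combustion):
    """Standard regenerator effectiveness: fraction of the maximum
    possible preheat actually delivered to the raw gas, using inlet,
    preheated and combustion-zone temperatures (all in kelvin).
    This is the common textbook definition, not necessarily the
    efficiency measure used in the study above."""
    return (t_preheated - t_raw_in) / (t_combustion - t_raw_in)
```

For example, a raw gas entering at 300 K, preheated to 1250 K against a 1300 K combustion zone, gives an effectiveness of 0.95.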
1342 Active Learning Strategies to Develop Student Skills in Information Systems for Management
Authors: F. Castro Lopes, S. Fernandes
Abstract:
Active learning strategies are at the center of any change process aimed at improving the development of student skills. This paper analyzes the impact of teaching strategies, including problem-based learning (PBL), in the curricular unit of information systems for management, based on students' perceptions of how these strategies contribute to the desired learning outcomes of the unit. The course belongs to the 1st semester of the 3rd year of the graduate degree program in management at a private higher education institution in Portugal. The methodology included an online questionnaire to students (n = 40). The findings reveal a positive impact of the teaching strategies used: overall, 35% of the students considered that the strategies implemented in the course contributed to the development of the course's learning objectives, and they regarded PBL as the learning strategy that best contributed to enhancing the course's learning outcomes. This conclusion brings forward the need for further reflection and discussion on the impact of student feedback on teaching and learning processes.
Keywords: Higher education, active learning strategies, skills development, student assessment.
1341 Low Complexity Multi Mode Interleaver Core for WiMAX with Support for Convolutional Interleaving
Authors: Rizwan Asghar, Dake Liu
Abstract:
A hardware-efficient, multi-mode, re-configurable interleaver/de-interleaver architecture for multiple standards, such as DVB, WiMAX and WLAN, is presented. Interleavers consume a large part of the silicon area when implemented with conventional methods, as they use memories to store the permutation patterns. In addition, the different types of interleavers in different standards cannot share hardware due to their different construction methodologies. The novelty of the work presented in this paper is threefold: 1) vital types of interleavers, including the convolutional interleaver, are mapped onto a single architecture with the flexibility to change the interleaver size; 2) the hardware complexity of channel interleaving in WiMAX is reduced by a 2-D realization of the interleaver functions; and 3) silicon cost overheads are reduced by avoiding the use of small memories. The proposed architecture consumes 0.18 mm² of silicon area in a 0.12 μm process and can operate at a frequency of 140 MHz. The reduced complexity helps to minimize memory utilization while providing strong support for on-the-fly computation of permutation patterns.
Keywords: Hardware interleaver implementation, WiMAX, DVB, block interleaver, convolutional interleaver, hardware multiplexing.
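The 2-D realization mentioned in point 2 can be illustrated with a plain block interleaver, where the permutation address is computed arithmetically from row/column indices instead of being stored as a table in memory (a sketch of the idea only; the actual WiMAX channel interleaver adds further intra-row permutation steps beyond this):

```python
def block_interleave(data, rows, cols):
    """2-D block interleaver: write the sequence row by row into a
    rows x cols array, then read it out column by column. The address
    r * cols + c is computed on the fly, so no permutation table is
    stored."""
    assert len(data) == rows * cols
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(data, rows, cols):
    """Inverse operation: write column by column, read row by row,
    which is the same mapping with the dimensions swapped."""
    return block_interleave(data, cols, rows)
```

Interleaving `[0, 1, 2, 3, 4, 5]` with 2 rows and 3 columns yields `[0, 3, 1, 4, 2, 5]`, and de-interleaving restores the original order.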
1340 Model-free Prediction based on Tracking Theory and Newton Form of Polynomial
Authors: Guoyuan Qi, Yskandar Hamam, Barend Jacobus van Wyk, Shengzhi Du
Abstract:
The majority of existing predictors for time series are model-dependent and therefore require prior knowledge for the identification of complex systems, usually involving system identification, extensive training, or online adaptation in the case of time-varying systems. Additionally, since a time series is usually generated by complex processes such as the stock market or other chaotic systems, identification, modeling, or the online updating of parameters can be problematic. In this paper a model-free predictor (MFP) for a time series produced by an unknown nonlinear system or process is derived using tracking theory. An identical derivation of the MFP using the property of the Newton form of the interpolating polynomial is also presented. The MFP is able to accurately predict future values of a time series, is stable, has few tuning parameters, and is desirable for engineering applications due to its simplicity, fast prediction speed and extremely low computational load. The performance of the proposed MFP is demonstrated by predicting the Dow Jones Industrial Average stock index.
Keywords: Forecast, model-free predictor, prediction, time series.
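The Newton-form idea can be illustrated with classic forward-difference extrapolation: for equally spaced samples, the one-step-ahead value of the interpolating polynomial is the sum of the trailing entries of the difference table (a sketch of the principle, not the authors' full MFP):

```python
def predict_next(history):
    """One-step-ahead model-free prediction: extrapolate the Newton
    interpolating polynomial through the last equally spaced samples
    by summing the last entry of each forward-difference row."""
    diffs = list(history)
    prediction = 0.0
    for _ in range(len(history)):
        prediction += diffs[-1]                       # trailing entry of this row
        diffs = [b - a for a, b in zip(diffs, diffs[1:])]  # next difference row
    return prediction
```

For instance, the quadratic sequence 1, 4, 9, 16 is extrapolated exactly to 25; for a noisy real series such as a stock index, only a short history window would be used.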
1339 Tidal Current Behaviors and Remarkable Bathymetric Change in the South-Western Part of Khor Abdullah, Kuwait
Authors: Ahmed M. Al-Hasem
Abstract:
A study of tidal current behavior and bathymetric changes was undertaken in order to establish an information base for future coastal management. The average tidal current velocity was 0.46 m/s, and the maximum velocity was 1.08 m/s during ebb tide. During spring tides, maximum velocities range from 0.90 m/s to 1.08 m/s, whereas during neap tides they vary from 0.40 m/s to 0.60 m/s. Despite the greater current velocities during flood tide, the bathymetric features enhance the dominance of the ebb tide. This can be related to the abundance of fine sediments carried by the ebb current approaching the study area, and the relatively coarser sediment carried by the approaching flood current. Significant bathymetric changes, dominated by erosion, were found for the period from 1985 to 1998. Approximately 96.5% of the depth changes fell within the depth-change classes of -5 m to 5 m. The intense erosion within the study area will subsequently result in high accretion, particularly in the north, the location of the proposed Boubyan Port and its navigation channel.
Keywords: Bathymetric change, Boubyan Island, GIS, Khor Abdullah, tidal current behavior.
1338 Finite Element Simulation of Multi-Stage Deep Drawing Processes and Comparison with Experimental Results
Authors: A. Pourkamali Anaraki, M. Shahabizadeh, B. Babaee
Abstract:
The plastic forming of sheet metal plays an important role in metal forming. The traditional tool-design techniques for sheet forming operations used in industry are experimental and expensive. Predicting the forming results and determining the punch force, blank-holder forces and the thickness distribution of the sheet metal decreases the production cost and the time needed to form the material. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. All production steps, together with additional operations such as intermediate annealing and springback, were simulated with the ABAQUS software under axisymmetric conditions. Simulation results such as the sheet thickness distribution, punch force and residual stresses were extracted at every stage, and the sheet thickness distribution was compared with experimental results. The comparison showed the FE model to be in close agreement with the experiment.
Keywords: Deep drawing, finite element method, simulation.
1337 Degradation of EE2 by Different Consortium of Enriched Nitrifying Activated Sludge
Authors: Pantip Kayee
Abstract:
17α-ethinylestradiol (EE2) is a recalcitrant micropollutant found in small amounts in municipal wastewater. Even these small amounts adversely affect the reproductive function of aquatic organisms. Past evidence suggested that full-scale WWTPs equipped with a nitrification process enhanced the removal of EE2 from municipal wastewater. EE2 has been proven to be transformable by ammonia oxidizing bacteria (AOB) via co-metabolism. This research aims to clarify the EE2 degradation patterns of different consortia of ammonia oxidizing microorganisms (AOM), including AOA (ammonia oxidizing archaea), and to investigate the respective contributions of the existing ammonia monooxygenase (AMO) and newly synthesized AOM. The results showed that AOA or AOB of the N. oligotropha cluster in nitrifying activated sludge (NAS) enriched at 2 mM and 5 mM, commonly found in municipal WWTPs, could degrade EE2 in wastewater via co-metabolism. Moreover, the investigation of the contribution of existing versus newly synthesized AMO demonstrated that the newly synthesized AMO enzyme may perform the ammonia oxidation rather than the existing AMO enzyme, or that the existing AMO enzyme may have only a small capacity to oxidize ammonia.
Keywords: 17α-ethinylestradiol, nitrification, ammonia oxidizing bacteria, ammonia oxidizing archaea.