Search results for: N-15 Tracer Techniques
490 Perceptions toward Adopting Virtual Reality as a Learning Aid in Information Technology
Authors: S. Alfalah, J. Falah, T. Alfalah, M. Elfalah, O. Falah
Abstract:
The field of education is an ever-evolving area, constantly enriched by newly discovered techniques provided by active research in all areas of technology. Recent years have witnessed the introduction of a number of promising technologies and applications to enhance the teaching and learning experience. Virtual Reality (VR) applications are considered one of the evolving methods that have contributed to enhancing education in many fields. VR creates an artificial environment, using computer hardware and software, which is similar to the real world. This simulation improves the delivery of materials, facilitating the teaching process by providing a useful aid to instructors and enhancing the learning experience by providing a beneficial learning aid. In order to ensure future utilization of such systems, students’ perceptions toward utilizing VR as an educational tool were examined in the Faculty of Information Technology (IT) at The University of Jordan. A questionnaire was administered to IT undergraduates investigating students’ opinions about the potential opportunities that VR technology could offer and its implications as a learning and teaching aid. The results confirmed the end users’ willingness to adopt VR systems as a learning aid. The results of this research form a solid base for investing in a VR system for IT education.
Keywords: Education, information, technology, virtual reality.
489 Effect of Size of the Step in the Response Surface Methodology using Nonlinear Test Functions
Authors: Jesús Everardo Olguín Tiznado, Rafael García Martínez, Claudia Camargo Wilson, Juan Andrés López Barreras, Everardo Inzunza González, Javier Ordorica Villalvazo
Abstract:
The response surface methodology (RSM) is a collection of mathematical and statistical techniques useful for modeling and analyzing problems in which a dependent variable is influenced by several independent variables, with the aim of determining the conditions under which these variables should operate to optimize a production process. RSM estimates a first-order regression model and sets the search direction using the method of maximum/minimum slope up/down (MMS U/D). However, this method selects the step size intuitively, which can affect the efficiency of the RSM. This paper assesses how the step size affects the efficiency of this methodology. The numerical examples are carried out through Monte Carlo experiments, evaluating three response variables: the gain in the efficiency function, the distance to the optimum, and the number of iterations. The simulation experiments showed that the gain in the efficiency function and the distance to the optimum were not affected by the step size, while the number of iterations was affected by both the step size and the type of test function used.
Keywords: RSM, dependent variable, independent variables, efficiency, simulation.
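As an illustration of the mechanism the abstract evaluates, the sketch below runs a steepest-ascent (maximum-slope) search on a toy quadratic test function with three different step sizes and reports the iterations used and the final distance to the optimum; the test function, tolerance and step values are assumptions for demonstration, not the authors' Monte Carlo setup.

```python
import numpy as np

def fit_first_order(f, x, delta=0.1):
    """Estimate the first-order (gradient) coefficients by central differences."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = delta
        g[i] = (f(x + e) - f(x - e)) / (2 * delta)
    return g

def steepest_ascent(f, x0, step, max_iter=500, tol=0.05):
    """Method of maximum slope: move along the fitted gradient with a fixed step size."""
    x = np.array(x0, dtype=float)
    for it in range(1, max_iter + 1):
        g = fit_first_order(f, x)
        if np.linalg.norm(g) < tol:           # nearly flat: assume the optimum is reached
            return x, it
        x = x + step * g / np.linalg.norm(g)  # the step size controls the move length
    return x, max_iter

# Toy nonlinear test function with its maximum at (3, -2).
f = lambda x: -(x[0] - 3.0) ** 2 - (x[1] + 2.0) ** 2

for step in (0.05, 0.2, 0.8):
    x_best, iters = steepest_ascent(f, x0=[0.0, 0.0], step=step)
    dist = np.linalg.norm(x_best - np.array([3.0, -2.0]))
    print(f"step={step:4.2f}  iterations={iters:3d}  distance to optimum={dist:.3f}")
```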
488 Disciplinary Procedures Used by Secondary School Teachers in Calabar Municipality, Nigeria
Authors: N. N. Nkomo, M. L. Mayanchi
Abstract:
The present study investigated the various forms of disciplinary procedures, or punishment, used by teachers in secondary schools in Calabar Municipality, Nigeria. There is ongoing debate amongst parents and educators on the use of corporal punishment as a disciplinary measure against children. Those against the use of corporal punishment argue that this form of punishment does not teach; it only suppresses behaviour temporarily and inculcates violence. Those in support are of the view that corporal punishment serves as a deterrent to others. This study sought to find out the most common measures of discipline employed by teachers in private and public schools. The study had three objectives, three research questions and two hypotheses. The design of the present study was the ex-post facto descriptive survey, since the variables under study were not manipulated by the researcher. Teachers in Calabar Municipal secondary schools formed the population. A sample of 160 teachers was used for the study. The data collection instrument was a fact-finding questionnaire titled the Disciplinary Procedures Inventory. Data collected were analyzed using simple percentages and chi-square. The major findings were that physical measures such as flogging, exercise/drills, and painful postures were commonly used by teachers in secondary schools. It was also found that these measures were more often used in public schools. It was recommended that teachers employ non-violent techniques of discipline rather than physical punishment.
Keywords: Discipline, non-violent punishment, physical punishment.
487 Formulation of Mortars with Marine Sediments
Authors: Nor-Edine Abriak, Mouhamadou Amar, Mahfoud Benzerzour
Abstract:
The transition to a more sustainable economy requires a reduction in the consumption of raw materials for equivalent production. The recovery of by-products, and especially of dredged sediments, as mineral additions in cement matrices represents an alternative that reduces raw material consumption and the construction sector’s carbon footprint. However, the efficient use of sediment requires adequate and optimal treatment. Several processing techniques have so far been applied in order to improve some physicochemical properties. Heat treatment by calcination was effective in removing the organic fraction and activating the pozzolanic properties. In this article, the effect of an optimized heat treatment of marine sediments on the physico-mechanical and environmental properties of mortars is shown. A key finding is that the optimal substitution of a portion of cement by sediments calcined at 750 °C helps to maintain or improve the mechanical properties of the cement matrix in comparison with a standard reference mortar. The use of calcined sediment enhances mortar behavior in terms of mechanical strength and durability. From an environmental and life-cycle point of view, mortars formulated with treated sediments are considered inert with respect to the French inert waste storage facility reference (ISDI).
Keywords: Sediment, calcination, cement, reuse.
486 Optical Fish Tracking in Fishways using Neural Networks
Authors: Alvaro Rodriguez, Maria Bermudez, Juan R. Rabuñal, Jeronimo Puertas
Abstract:
One of the main issues in Computer Vision is to extract the movement of one or several points or objects of interest in an image or video sequence in order to conduct any kind of study or control process. Different techniques to solve this problem have been applied in numerous areas such as surveillance systems, traffic analysis, motion capture, image compression and navigation systems, where the specific characteristics of each scenario determine the approach to the problem. This paper puts forward a Computer Vision based algorithm to analyze fish trajectories under high turbulence conditions in artificial structures called vertical slot fishways, which are designed to allow the upstream migration of fish through obstructions in rivers. The suggested algorithm calculates the position of the fish at every instant, starting from images recorded with a camera and using neural networks to perform fish detection on the images. Different laboratory tests have been carried out in a full-scale fishway model with live fish, allowing the reconstruction of fish trajectories and the measurement of fish velocities and accelerations. These data can provide useful information for designing more effective vertical slot fishways.
Keywords: Computer Vision, Neural Network, Fishway, Fish Trajectory, Tracking
485 Image Clustering Framework for BAVM Segmentation in 3DRA Images: Performance Analysis
Authors: F. H. Sarieddeen, R. El Berbari, S. Imad, J. Abdel Baki, M. Hamad, R. Blanc, A. Nakib, Y. Chenoune
Abstract:
Brain ArterioVenous Malformation (BAVM) is an abnormal tangle of brain blood vessels where arteries shunt directly into veins with no intervening capillary bed, which causes high pressure and hemorrhage risk. The success of treatment by embolization in interventional neuroradiology is highly dependent on the accuracy of the vessel visualization. In this paper, the performance of clustering techniques for vessel segmentation from 3-D rotational angiography (3DRA) images is investigated and a new segmentation technique is proposed. The method consists of a preprocessing step of image enhancement, after which K-Means (KM), Fuzzy C-Means (FCM) and Expectation Maximization (EM) clustering are used to separate vessel pixels from the background and, where possible, artery pixels from vein pixels. A post-processing step of removing false-alarm components is applied before constructing a three-dimensional volume of the vessels. The proposed method was tested on six datasets together with the assessment of a medical expert. The obtained results showed encouraging segmentations.
Keywords: Brain arteriovenous malformation (BAVM), 3-D rotational angiography (3DRA), K-Means (KM) clustering, Fuzzy C-Means (FCM) clustering, Expectation Maximization (EM) clustering, volume rendering.
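A minimal sketch of the clustering stage described above, using scikit-learn's KMeans and GaussianMixture (as an EM implementation) to split a synthetic intensity image into background and vessel classes; the toy image and two-cluster setup are illustrative assumptions, not the authors' 3DRA pipeline, and scikit-learn has no built-in Fuzzy C-Means.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Synthetic "angiography slice": dark background with a brighter vessel-like band.
rng = np.random.default_rng(0)
img = rng.normal(50, 10, (64, 64))
img[28:36, :] += 120                       # bright horizontal "vessel"
x = img.reshape(-1, 1)                     # cluster on intensity only

km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(x)
em_labels = GaussianMixture(n_components=2, random_state=0).fit_predict(x)

def vessel_mask(labels, values):
    """Call the cluster with the higher mean intensity the vessel class."""
    means = [values[labels == k].mean() for k in (0, 1)]
    return (labels == int(np.argmax(means))).reshape(img.shape)

for name, labels in (("K-Means", km_labels), ("EM (GMM)", em_labels)):
    mask = vessel_mask(labels, x.ravel())
    print(f"{name}: {mask.sum()} pixels labelled as vessel")
```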
484 Identification of Most Frequently Occurring Lexis in Winnings-announcing Unsolicited Bulk e-mails
Authors: Jatinderkumar R. Saini, Apurva A. Desai
Abstract:
E-mail has become an important means of electronic communication, but the viability of its usage is marred by Unsolicited Bulk e-mail (UBE) messages. UBE comes in many types, such as pornographic, virus-infected and 'cry-for-help' messages, as well as fake and fraudulent offers for jobs, winnings and medicines. UBE poses technical and socio-economic challenges to the usage of e-mail. To meet this challenge and combat this menace, we need to understand UBE. Towards this end, the current paper presents a content-based textual analysis of nearly 3000 winnings-announcing UBE messages. Technically, this is an application of text parsing and tokenization for unstructured textual documents, which we approach using Bag of Words (BOW) and Vector Space Document Model techniques. We have attempted to identify the most frequently occurring lexis in the winnings-announcing UBE documents, and the analysis of the top 100 such lexis is presented. We exhibit the relationship between the occurrence of a word from the identified lexis set in a given UBE and the probability that the given UBE is one announcing fake winnings. To the best of our knowledge and our survey of the related literature, this is the first formal attempt to identify the most frequently occurring lexis in winnings-announcing UBE by textual analysis. Finally, this is a sincere attempt to raise alertness against, and mitigate the threat of, such luring but fake UBE.
Keywords: Lexis, Unsolicited Bulk e-mail (UBE), Vector Space Document Model, Winnings, Lottery.
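The bag-of-words counting step can be illustrated as below: tokenize each message, accumulate term frequencies, and rank the most frequent lexis; the sample texts and stop-word list are hypothetical stand-ins for the 3000-message corpus.

```python
import re
from collections import Counter

emails = [
    "Congratulations! You have won the international lottery prize of $1,000,000.",
    "Your email address won our lottery draw. Claim the prize now.",
    "Final notice: winner of the award prize must claim winnings today.",
]

STOP_WORDS = {"the", "of", "our", "your", "you", "have", "must", "now", "a", "to"}

def tokenize(text):
    """Lower-case and split into word tokens (bag-of-words view)."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOP_WORDS]

counts = Counter()
for msg in emails:
    counts.update(tokenize(msg))

# Most frequently occurring lexis across the corpus.
for word, freq in counts.most_common(10):
    print(f"{word:15s} {freq}")
```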
483 Machine Learning for Music Aesthetic Annotation Using MIDI Format: A Harmony-Based Classification Approach
Authors: Lin Yang, Zhian Mi, Jiacheng Xiao, Rong Li
Abstract:
Swimming with the tide of deep learning, the field of music information retrieval (MIR) has experienced parallel development, and a sheer variety of feature-learning models has been applied to music classification and tagging tasks. Among these learning techniques, deep convolutional neural networks (CNNs) have been widely used, with better performance than traditional approaches, especially in music genre classification and prediction. However, regarding music recommendation, there is a large semantic gap between the corresponding audio genres and the various aspects of a song that influence user preference. In our study, aiming to bridge this gap, we strive to construct an automatic music aesthetic annotation model based on the MIDI format for better comparison and measurement of the similarity between music pieces by means of harmonic analysis. We use the matrix of qualification converted from MIDI files as input to train two different classifiers, a support vector machine (SVM) and a Decision Tree (DT). Experimental results on a tag prediction task show that both learning algorithms are capable of extracting high-level properties from music information in an end-to-end manner. The proposed model helps to learn audience taste, so that the resulting recommendations are likely to appeal to niche consumers.
Keywords: Harmonic analysis, machine learning, music classification and tagging, MIDI.
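A minimal sketch of the classification step, feeding a placeholder feature matrix (standing in for the MIDI-derived matrix of qualification) to an SVM and a Decision Tree and comparing their tag-prediction accuracy; the random features and synthetic tag are assumptions for demonstration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Placeholder data: 200 pieces x 24 harmony features, with a binary aesthetic tag.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 24))
y = (X[:, :4].sum(axis=1) > 0).astype(int)   # synthetic tag tied to a few features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in (("SVM", SVC(kernel="rbf", C=1.0)),
                  ("Decision Tree", DecisionTreeClassifier(max_depth=5, random_state=0))):
    clf.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, clf.predict(X_te))
    print(f"{name}: tag-prediction accuracy = {acc:.2f}")
```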
482 An Application for Risk of Crime Prediction Using Machine Learning
Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento
Abstract:
The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly with the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation are presented, starting with the data collection from its original source, the treatment and transformations applied to the data, the choice, evaluation and implementation of the Machine Learning model, and ending with the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformation used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to request predictions in real time. An application is also presented in which the criminal predictions can be shown visually.
Keywords: Crime prediction, machine learning, public safety, smart city.
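As a sketch of the model-comparison step, the code below trains three of the named classifiers on synthetic time/location features and compares them on a shared train/test split; the incident data, feature encoding and metric are assumptions, not the city's historical dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Placeholder incidents: hour of day, weekday, grid cell x/y, demographic proxy.
rng = np.random.default_rng(7)
n = 1000
X = np.column_stack([
    rng.integers(0, 24, n),      # hour
    rng.integers(0, 7, n),       # weekday
    rng.integers(0, 20, n),      # grid x
    rng.integers(0, 20, n),      # grid y
    rng.normal(100, 25, n),      # demographic proxy
])
# Synthetic "high crime risk" label concentrated at night in a few cells.
y = ((X[:, 0] >= 20) & (X[:, 2] < 5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=1),
    "K-Nearest Neighbors": KNeighborsClassifier(n_neighbors=5),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: F1 = {f1_score(y_te, model.predict(X_te)):.2f}")
```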
481 Motivational Antecedents that Influenced a Higher Education Institution in the Philippines to Adopt Enterprise Architecture
Authors: Ma. Eliza Jijeth V. dela Cruz
Abstract:
Technology is a recent prodigy in people’s everyday life that has taken off. It has infiltrated almost every aspect of people’s lives, changing how people work, how people learn and how people perceive things. Academic institutions, just like other organizations, have deeply modified their strategies to integrate technology into the institutional vision and corporate strategy to a degree that has never been greater. Information and Communications Technology (ICT) continues to be recognized as a major factor in organizations realizing their aims and objectives. Consequently, ICT has an important role in mobilizing an academic institution’s strategy to support the delivery of operational, strategic or transformational objectives. This ICT strategy should align the institution with the radical changes of the ICT world through the use of Enterprise Architecture (EA). Hence, EA’s objective is to integrate the islands of legacy processes into an optimized whole that is receptive to change and supportive of the delivery of the strategy. In this paper, the focus is to explore the motivational antecedents during the adoption of EA in a Higher Education Institution in the Philippines for its ICT strategic plan. The seven antecedents (viewpoint, stakeholders, human traits, vision, revolutionary innovation, techniques and change components) provide insight into EA adoption and the factors that influence the process of EA adoption.
Keywords: Enterprise architecture, adoption, antecedents, higher education institution.
480 MONPAR - A Page Replacement Algorithm for a Spatiotemporal Database
Authors: U. Kalay, O. Kalıpsız
Abstract:
For a spatiotemporal database management system, the I/O cost of queries and other operations is an important performance criterion. In order to optimize this cost, intense research on designing robust index structures has been carried out over the past decade. Beyond these major considerations, there are still other design issues that deserve attention due to their direct impact on the I/O cost; in particular, an efficient buffer management strategy plays a key role in reducing redundant disk accesses. In this paper, we propose an efficient buffer strategy for a spatiotemporal database index structure, specifically one indexing objects moving over a network of roads. The proposed strategy, namely MONPAR, is based on the data type (i.e. spatiotemporal data) and the structure of the index. For the purpose of an experimental evaluation, we set up a simulation environment that counts the number of disk accesses while executing a number of spatiotemporal range queries over the index. We repeated the simulations with query sets of different distributions, such as uniform and skewed query distributions. Based on the comparison of our strategy with well-known page-replacement techniques, such as LRU-based and priority-based buffers, we conclude that MONPAR behaves better than its competitors for small and medium-sized buffers under all of the query distributions used.
Keywords: Buffer Management, Spatiotemporal databases.
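MONPAR itself is not specified in enough detail here to reproduce, but the LRU baseline it is compared against can be sketched as below, counting disk accesses (buffer misses) over a skewed stream of page requests; the workload and buffer sizes are illustrative.

```python
from collections import OrderedDict
import random

class LRUBuffer:
    """Fixed-size page buffer with least-recently-used replacement."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()
        self.disk_accesses = 0

    def access(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)      # buffer hit: refresh recency
        else:
            self.disk_accesses += 1              # miss: read page from disk
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)   # evict least recently used
            self.pages[page_id] = True

# Skewed query workload: a few "hot" index pages are requested far more often.
random.seed(0)
workload = [random.choice(range(10)) if random.random() < 0.8
            else random.choice(range(10, 200))
            for _ in range(5000)]

for capacity in (16, 64):
    buf = LRUBuffer(capacity)
    for page in workload:
        buf.access(page)
    print(f"buffer size {capacity:3d}: {buf.disk_accesses} disk accesses")
```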
479 Defining a Pathway to Zero Energy Building: A Case Study on Retrofitting an Old Office Building into a Net Zero Energy Building for Hot-Humid Climate
Authors: Kwame B. O. Amoah
Abstract:
This paper focuses on retrofitting an old existing office building into a net-zero energy building (NZEB). An existing small office building in Melbourne, Florida, was chosen as a case study to integrate state-of-the-art design strategies and energy-efficient building systems in order to improve building performance and reduce energy consumption. The study aimed to explore possible ways to maximize energy savings and renewable energy generation sources to cover the building's remaining energy needs necessary to achieve net-zero energy goals. A series of retrofit options were reviewed and adopted with some significant additional decision considerations. The detailed process and considerations leading to zero energy are well documented in this study, with lessons learned adequately outlined. Based on building energy simulations, multiple design considerations were investigated, such as emerging state-of-the-art technologies, material selection, improvements to the building envelope, optimization of the HVAC and lighting systems, occupancy load analysis, and the application of renewable energy sources. The comparative analysis of simulation results was used to determine how specific techniques led to energy savings and cost reductions. The research results indicate that this small office building can reach net-zero energy use after appropriate design interventions and the application of renewable energy sources.
Keywords: Energy consumption, building energy analysis, energy retrofits, energy-efficiency.
478 SLM Using Riemann Sequence Combined with DCT Transform for PAPR Reduction in OFDM Communication Systems
Authors: Pepin Magnangana Zoko Goyoro, Ibrahim James Moumouni, Sroy Abouty
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM) is an efficient method of data transmission for high-speed communication systems. However, the main drawback of OFDM systems is that they suffer from a high Peak-to-Average Power Ratio (PAPR), which causes inefficient use of the high power amplifier and can limit transmission efficiency. OFDM consists of a large number of independent subcarriers; as a result, the amplitude of the composite signal can have high peak values. In this paper, we propose an effective reduction scheme that combines the DCT and SLM techniques. The scheme is composed of the DCT followed by SLM, using the Riemann matrix to obtain the phase sequences for the SLM technique. The simulation results show that the PAPR can be greatly reduced by applying the proposed scheme. In comparison with conventional OFDM, which exhibited a high PAPR of about 10.4 dB, our proposed method achieved a reduction of about 4.7 dB in the PAPR with low computational complexity. This approach also avoids randomness in the phase sequence selection, which makes it simpler to decode at the receiver. As an added benefit, the matrices can be generated at the receiver end to obtain the data signal, and hence it is not required to transmit side information (SI).
Keywords: DCT transform, OFDM, PAPR, Riemann matrix, SLM.
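The PAPR metric and the SLM-style candidate selection can be sketched as below; the pseudo-random unit-modulus phase sequences are only a stand-in for the deterministic Riemann-matrix rows, and the DCT precoding of the real and imaginary parts is an illustrative simplification.

```python
import numpy as np
from scipy.fft import dct, ifft

def papr_db(x):
    """Peak-to-average power ratio of a complex time-domain signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(3)
N = 256                                                  # number of subcarriers
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)

# Plain OFDM: IFFT of the frequency-domain QPSK symbols.
plain = ifft(qpsk) * np.sqrt(N)

# DCT precoding of the data, then SLM: rotate by U candidate phase sequences
# and transmit the candidate with the lowest PAPR.
precoded = dct(qpsk.real, norm="ortho") + 1j * dct(qpsk.imag, norm="ortho")
U = 8
bank = np.array([1, -1, 1j, -1j])                        # unit-modulus phase alphabet
phase_seqs = bank[rng.integers(0, 4, size=(U, N))]       # stand-in for Riemann rows

candidates = [ifft(precoded * phase_seqs[u]) * np.sqrt(N) for u in range(U)]
best = min(candidates, key=papr_db)

print(f"plain OFDM PAPR        : {papr_db(plain):5.2f} dB")
print(f"DCT + SLM (best of {U}) : {papr_db(best):5.2f} dB")
```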
477 Mercury Removal Using Pseudomonas putida (ATTC 49128): Effect of Acclimatization Time, Speed and Temperature of Incubator Shaker
Authors: A. A. M. Azoddein, R. M. Yunus, N. M. Sulaiman, A. B. Bustary, K. Sabar
Abstract:
Microbes have been used to solve environmental problems for many years. The ability of microorganisms to sequester, precipitate or alter the oxidation state of various heavy metals has been extensively studied, and the ways in which microorganisms interact with toxic metals during treatment are very diverse. The purpose of this research is to remove mercury using a pure culture of Pseudomonas putida (P. putida), ATTC 49128, at optimum growth parameters such as culture technique, acclimatization time and incubator shaker speed. Thus, in this study, the optimum growth parameters of P. putida were obtained to achieve maximum mercury removal. Based on the optimum parameters of P. putida for specific growth rate, the removal of two different mercury concentrations, 1 ppm and 4 ppm, was studied. A mercury-resistant bacterial strain able to reduce ionic mercury from a mercury nitrate solution to metallic mercury was used. The overall levels of mercury removal in this study were between 80% and 89%. The information obtained in this study is fundamental for understanding the survival of P. putida ATTC 49128 in mercury solution. Thus, microbial mercury removal is a potential bioremediation approach for wastewater, especially in the petrochemical industries in Malaysia.
Keywords: Pseudomonas putida, growth kinetics, biosorption, mercury, petrochemical wastewater.
476 An Advanced Approach Based on Artificial Neural Networks to Identify Environmental Bacteria
Authors: Mauro Giacomini, Stefania Bertone, Federico Caneva Soumetz, Carmelina Ruggiero
Abstract:
Environmental micro-organisms include a large number of taxa, and some species that are generally considered non-pathogenic can represent a risk in certain conditions, especially for elderly people and immunocompromised individuals. Chemotaxonomic identification techniques are powerful tools for environmental micro-organisms, and cellular fatty acid methyl ester (FAME) content provides an effective fingerprinting identification technique. A system based on an unsupervised artificial neural network (ANN) was set up using, as learning data, the fatty acid profiles of standard bacterial strains obtained by gas chromatography. We analysed 45 certified strains belonging to the Acinetobacter, Aeromonas, Alcaligenes, Aquaspirillum, Arthrobacter, Bacillus, Brevundimonas, Enterobacter, Flavobacterium, Micrococcus, Pseudomonas, Serratia, Shewanella and Vibrio genera. A set of 79 bacteria isolated from a drinking water line (AMGA, the major water supply system in Genoa) was used as an identification example and compared with the standard MIDI method. The resulting ANN output map was found to be a very powerful tool for identifying these fresh isolates.
Keywords: Cellular fatty acid methyl esters, environmental bacteria, gas-chromatography, unsupervised ANN.
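A minimal sketch of an unsupervised self-organizing map trained on placeholder fatty-acid profiles and used to map new samples to their best-matching unit; the grid size, learning schedule and synthetic FAME vectors are assumptions, not the gas-chromatography data.

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=100, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map on the row vectors in `data`."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows, cols, data.shape[1]))
    coords = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"))
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)            # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)      # shrinking neighbourhood
        for x in data:
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
            dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=2)
            h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)
    return weights

def map_sample(weights, x):
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# Placeholder FAME profiles: two genera with different dominant fatty acids.
rng = np.random.default_rng(1)
genus_a = rng.normal([0.6, 0.2, 0.1, 0.1], 0.05, (20, 4))
genus_b = rng.normal([0.1, 0.1, 0.5, 0.3], 0.05, (20, 4))
profiles = np.vstack([genus_a, genus_b])

som = train_som(profiles)
print("genus A sample maps to unit", map_sample(som, genus_a[0]))
print("genus B sample maps to unit", map_sample(som, genus_b[0]))
```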
475 Production of Energetic Nanomaterials by Spray Flash Evaporation
Authors: Martin Klaumünzer, Jakob Hübner, Denis Spitzer
Abstract:
Within this paper, the latest results on the processing of energetic nanomaterials by means of the Spray Flash Evaporation technique are presented. This technology constitutes a highly effective and continuous way to prepare fascinating materials on the nano- and micro-scale. Within the process, a solution is set under high pressure and sprayed into an evacuated atomization chamber. Subsequent ultrafast evaporation of the solvent leads to an aerosol stream, which is separated by cyclones or filters. No drying gas is required, so the present technique should not be confused with spray drying. The resulting nanothermites, insensitive explosives or propellants and compositions are foreseen to replace toxic (according to REACH) and very sensitive matter in military and civil applications. Diverse examples are given in detail: nano-RDX (n-Cyclotrimethylentrinitramin) and nano-aluminum based systems, mixtures (n-RDX/n-TNT - trinitrotoluene) or even cocrystalline matter like n-CL-20/HMX (Hexanitrohexaazaisowurtzitane/Cyclotetramethylentetranitramin). These nanomaterials tend to show reduced sensitivity without losing effectiveness and performance. An analytical study for material characterization was performed using Atomic Force Microscopy, X-Ray Diffraction and combined techniques, as well as spectroscopic methods. As a matter of course, sensitivity tests regarding electrostatic discharge, impact, and friction are provided.
Keywords: Continuous synthesis, energetic material, nanoscale, nanothermite, nanoexplosive.
474 Benefits of Construction Management Implications and Processes by Projects Managers on Project Completion
Authors: Mamoon Mousa Atout
Abstract:
Project managers in the construction industry usually face a difficult organizational environment, especially if the project is unique. The organization often lacks the processes to practice construction management correctly, and the executive technical managers lack the experience to play their roles and carry out their responsibilities correctly. Project managers need to adopt best practices that allow them to work effectively and ensure that the project can be delivered without any delay, while the executive technical managers should follow defined processes to avoid any factor that might cause delay during the project life cycle. The purpose of this paper is to examine the awareness level of project managers regarding construction management processes, tools, techniques and their implications for completing projects on time. The outcomes and results of the study are based on questionnaires and interviews conducted with many project managers. The method used in this paper is a quantitative study. A survey with a sample of 100 respondents, comprising nine questions to examine their level of awareness, was prepared and distributed in a construction company in Dubai. This research also identifies the benefits of construction management processes that project managers have to adopt to mitigate the potential problems that might cause delay to the project life cycle.
Keywords: Construction Methodology, Design Process, Project Managers, Scheduling and Resource Planning.
473 Improvement of Soft Clay Using Floating Cement Dust-Lime Columns
Authors: Adel Belal, Sameh Aboelsoud, Mohy Elmashad, Mohammed Abdelmonem
Abstract:
The two main criteria that control the design and performance of footings are the bearing capacity and the settlement of the soil. The construction of buildings, storage tanks, warehouses, etc. on weak soils usually involves excessive settlement problems. To solve bearing capacity or settlement problems, soil improvement may be considered using different techniques, including encased cement dust-lime columns. The proposed research studies the effect of adding floating encased cement dust and lime mix columns to soft clay on the clay bearing capacity. Four experimental tests were carried out. Column diameters of 3.0 cm, 4.0 cm, and 5.0 cm and a column length of 60% of the clay layer thickness were used. A numerical model was constructed and verified using a commercial finite element package (PLAXIS 2D, V8.5). The verified model was used to study the effect of distributing columns around the footing at different distances. The study showed that the floating cement dust-lime columns enhanced the clay bearing capacity by 262%. The numerical model showed that the columns around the footing have a limited effect on the clay improvement.
Keywords: Bearing capacity, cement dust – lime columns, ground improvement, soft clay.
472 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for making optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called the Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques, in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game-theoretic and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: Security, internet of things, cloud computing, Stackelberg security game, machine learning, Naïve Q-learning.
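The Naïve Q-Learning component can be illustrated with the standard tabular update below, in which a defender agent learns which defence action to prefer in each state; the toy payoff matrix, state transitions and epsilon-greedy schedule are assumptions for demonstration, not the SSG/SHARP model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_states, n_actions = 3, 4          # e.g. platform states x defence configurations
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.2

# Toy expected defender payoff for (state, defence action) against the attacker mix.
payoff = rng.uniform(-1.0, 1.0, (n_states, n_actions))

state = 0
for step in range(5000):
    # epsilon-greedy action selection
    if rng.random() < epsilon:
        action = int(rng.integers(n_actions))
    else:
        action = int(np.argmax(Q[state]))
    reward = payoff[state, action] + rng.normal(0, 0.1)    # noisy observed utility
    next_state = int(rng.integers(n_states))               # simplistic state transition
    # Q-learning update (model-free, off-policy)
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print("learned best defence per state:", np.argmax(Q, axis=1))
print("true best defence per state   :", np.argmax(payoff, axis=1))
```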
471 An Efficient Architecture for Interleaved Modular Multiplication
Authors: Ahmad M. Abdel Fattah, Ayman M. Bahaa El-Din, Hossam M.A. Fahmy
Abstract:
Modular multiplication is the basic operation in most public key cryptosystems, such as RSA, DSA, ECC, and DH key exchange. Unfortunately, very large operands (on the order of 1024 or 2048 bits) must be used to provide sufficient security strength. The use of such big numbers dramatically slows down the whole cipher system, especially when running on embedded processors. So far, customized hardware accelerators, developed on FPGAs or ASICs, have been the best choice for accelerating modular multiplication in embedded environments. On the other hand, many algorithms have been developed to speed up such operations; examples are the Montgomery modular multiplication and the interleaved modular multiplication algorithms. Combining customized hardware with an efficient algorithm is expected to provide a much faster cipher system. This paper introduces an enhanced architecture for computing the modular multiplication of two large numbers X and Y modulo a given modulus M. The proposed design is compared with three previous architectures based on carry-save adders and look-up tables, where the look-up tables must be loaded with a set of pre-computed values. Our proposed architecture uses the same carry-save addition but replaces both the look-up tables and the pre-computations with an enhanced version of sign detection techniques. The proposed architecture supports higher frequencies than the other architectures and also has a better overall absolute time for a single operation.
Keywords: Montgomery multiplication, modular multiplication, efficient architecture, FPGA, RSA.
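The interleaved modular multiplication algorithm that the architecture accelerates can be stated in a few lines of software: scan the multiplier bits from the most significant end, double and conditionally add, and reduce after every step so intermediate values stay close to the modulus; this bit-serial sketch mirrors the algorithm only, not the carry-save or sign-detection hardware.

```python
def interleaved_mod_mul(x: int, y: int, m: int) -> int:
    """Compute (x * y) mod m by interleaving shift-add steps with reduction."""
    result = 0
    for i in reversed(range(m.bit_length())):   # scan multiplier bits, MSB first
        result <<= 1                            # double the partial result
        if (y >> i) & 1:                        # add the multiplicand when the bit is set
            result += x
        # Keep the partial result in range with at most two subtractions;
        # in hardware this is where carry-save adders and sign detection are used.
        if result >= m:
            result -= m
        if result >= m:
            result -= m
    return result

# Quick check against Python's built-in modular arithmetic.
m = (1 << 61) - 1
x, y = 0x1234567890ABCDEF, 0x0FEDCBA987654321
assert interleaved_mod_mul(x % m, y % m, m) == (x * y) % m
print("interleaved result:", hex(interleaved_mod_mul(x % m, y % m, m)))
```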
470 Forecast of Polyethylene Properties in the Gas Phase Polymerization Aided by Neural Network
Authors: Nasrin Bakhshizadeh, Ashkan Forootan
Abstract:
A major problem affecting the quality control of polymer in industrial polymerization is the lack of suitable on-line measurement tools for evaluating polymer properties such as the melt index and density. Conventional control of the polymerization is performed manually by taking samples, measuring the quality of the polymer in the lab, and recording the results. This method is highly time-consuming and leads to the production of large amounts of non-conforming product. The online application for estimating melt index and density proposed in this study is a neural network based on input-output data from a polyethylene production plant. Temperature, reactor bed level, the mass flow rates of ethylene, hydrogen and butene-1, and the molar concentrations of ethylene, hydrogen and butene-1 are used to establish the neural model. The neural network is trained on actual operational data using back-propagation and Levenberg-Marquardt techniques. The simulation results indicate that the neural network process model, established with three layers (one hidden layer) for forecasting the density and four layers for the melt index, is able to successfully predict these quality properties.
Keywords: Polyethylene, polymerization, density, melt index, neural network.
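A minimal sketch of such a soft-sensor network using scikit-learn's MLPRegressor on placeholder process variables; the synthetic relationship between reactor conditions and melt index is an assumption, and scikit-learn trains with Adam/L-BFGS solvers rather than Levenberg-Marquardt.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

# Placeholder plant data: temperature, bed level, C2/H2/C4 flows and molar ratios.
rng = np.random.default_rng(5)
n = 2000
X = rng.normal(size=(n, 7))
# Synthetic melt index: a smooth nonlinear function of the inputs plus noise.
melt_index = np.exp(0.4 * X[:, 0] - 0.3 * X[:, 3] + 0.2 * X[:, 5]) + rng.normal(0, 0.05, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, melt_index, test_size=0.2, random_state=0)

# One hidden layer, as in the three-layer density model described above.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(r2_score(y_te, model.predict(X_te)), 3))
```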
469 Providing a Secure, Reliable and Decentralized Document Management Solution Using Blockchain by a Virtual Identity Card
Authors: Meet Shah, Ankita Aditya, Dhruv Bindra, V. S. Omkar, Aashruti Seervi
Abstract:
In today's world, we need documents everywhere for a smooth workflow in the identification process and for other security purposes. The current systems and techniques used for identification all require one thing, 'proof of existence', which involves valid documents, for example educational or financial ones. The main issue with the current identity access management systems and digital identification processes is that they are centralized, which makes them inefficient. This paper presents a system that resolves these issues. It is based on 'blockchain' technology, a decentralized system that allows transactions to be recorded in a decentralized and immutable manner. The primary notion of the model is to 'have everything with nothing': the required documents of a person are inter-linked with a single identity card so that the person can go anywhere without carrying the documents. The person just needs to be physically present at a place where documents are necessary, and the rest of the verification proceeds using a fingerprint impression and an iris scan. Furthermore, some technical overheads and advancements are listed. This paper also aims to lay out a far-reaching vision of blockchain and its impact on future trends.
Keywords: Blockchain, decentralized system, fingerprint impression, identity management, iris scan.
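The decentralized, immutable record-keeping idea can be illustrated with the minimal hash-chained ledger below, where each block commits to a document fingerprint and to the previous block so that tampering is detectable; this toy sketch is not the paper's identity-card system or any production blockchain.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash of the block's contents (stable field ordering)."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, document_id: str, document_bytes: bytes) -> dict:
    """Append a block committing to a document's fingerprint."""
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "document_id": document_id,
        "document_sha256": hashlib.sha256(document_bytes).hexdigest(),
        "prev_hash": block_hash(chain[-1]) if chain else "0" * 64,
    }
    chain.append(block)
    return block

def verify(chain: list) -> bool:
    """Check that every block still commits to its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger = []
add_block(ledger, "degree-certificate", b"...scanned certificate bytes...")
add_block(ledger, "bank-statement", b"...statement bytes...")
print("chain valid:", verify(ledger))

ledger[0]["document_id"] = "forged-certificate"      # tamper with an earlier block
print("chain valid after tampering:", verify(ledger))
```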
468 A Modified Run Length Coding Technique for Test Data Compression Based on Multi-Level Selective Huffman Coding
Authors: C. Kalamani, K. Paramasivam
Abstract:
Test data compression is an efficient method for reducing the test application cost. The problem of reducing test data has been addressed by researchers from three different angles: test data compression, Built-In Self-Test (BIST), and test set compaction. The latter two methods are capable of enhancing fault coverage at the cost of hardware overhead. The drawback of the conventional methods is that they can reduce test storage and test power, but when the test data contain redundant runs, no additional compression is applied. This paper presents a modified Run Length Coding (RLC) technique combined with Multi-Level Selective Huffman Coding (MLSHC) to reduce test data volume, test pattern delivery time and power dissipation in scan test applications; where a redundant run is encountered, the preceding run symbol is replaced with a tiny codeword. Experimental results show that the presented method not only improves the test data compression but also reduces the overall test data volume compared to recent schemes. Experiments on the six largest ISCAS-89 benchmarks show that our method outperforms most known techniques.
Keywords: Modified run length coding, multi-level selective Huffman coding, built-in self-test, modified selective Huffman coding, automatic test equipment.
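The run-length stage of such a scheme can be sketched as follows: the scan-test bit stream is collapsed into (bit, run-length) pairs whose most frequent run lengths would then receive the shortest selective-Huffman codewords; the bit stream shown is illustrative, not the paper's MLSHC codeword tables.

```python
from collections import Counter

def run_length_encode(bits: str):
    """Collapse a test-data bit stream into (bit, run_length) pairs."""
    runs = []
    current, count = bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = b, 1
    runs.append((current, count))
    return runs

# A scan-test-like vector with long runs of identical (often don't-care-filled) bits.
test_data = "0000000000001111111100000000000000111111111111110000"
runs = run_length_encode(test_data)
print("runs:", runs)

# Frequency of run lengths: the most frequent runs are the ones a selective
# Huffman coder would assign the shortest codewords to.
freq = Counter(length for _, length in runs)
print("run-length frequencies:", dict(freq))
print(f"original bits: {len(test_data)}, runs to encode: {len(runs)}")
```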
467 Mixed Traffic Speed–Flow Behavior under Influence of Road Side Friction and Non-Motorized Vehicles: A Comparative Study of Arterial Roads in India
Authors: Chetan R. Patel, G. J. Joshi
Abstract:
The present study was carried out on six-lane divided urban arterial roads in the Indian cities of Patna and Pune. The two roads differ distinctly in terms of vehicle composition and roadside parking. The arterial road in Patna carries 33% non-motorized modes, whereas the Pune arterial road is dominated by two-wheelers at 65%. Roadside parking is also observed in Patna. Field studies using videography techniques were carried out for traffic data collection. Data were extracted at one-minute intervals for vehicle composition, speed variation and flow rate on the selected arterial roads of the two cities. Speed-flow relationships were developed and the capacity was determined. An equivalency factor in terms of dynamic car units was determined to represent the mixed vehicles in a single common unit. The variation in capacity due to side friction, the presence of non-motorized traffic and the effective utilization of lane width is compared in the concluding remarks.
Keywords: Arterial Road, Capacity, Dynamic Equivalency Factor, Effect of Non-motorized mode, Side friction.
466 MPSO based Model Order Formulation Technique for SISO Continuous Systems
Authors: S. N. Deepa, G. Sugumaran
Abstract:
This paper proposes a new version of Particle Swarm Optimization (PSO), namely Modified PSO (MPSO), for the model order formulation of Single Input Single Output (SISO) linear time-invariant continuous systems. In the general PSO, the movement of a particle is governed by three behaviors, namely inertia, cognitive and social. The cognitive behavior helps the particle to remember its previously visited best position. The Modified PSO technique splits the cognitive behavior into two parts: the previously visited best position and the previously visited worst position. This modification helps the particle to search for the target very effectively. The MPSO approach is proposed to formulate a reduced-order model of the higher-order model. The method is based on the minimization of the error between the transient responses of the original higher-order model and the reduced-order model pertaining to a unit step input. The results obtained are compared with earlier techniques to validate its ease of computation. The proposed method is illustrated through a numerical example from the literature.
Keywords: Continuous System, Model Order Formulation, Modified Particle Swarm Optimization, Single Input Single Output, Transfer Function Approach.
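One reasonable reading of the split cognitive term is sketched below: each particle is attracted to its personal best and the global best and repelled from its personal worst while minimizing a simple error function; the coefficients, velocity clamp and test function are assumptions, not the authors' exact formulation.

```python
import numpy as np

def mpso_minimize(f, dim, n_particles=20, iters=150, seed=1):
    """Sketch of an MPSO-style search: memory of both personal best and personal worst."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pworst = x.copy(), x.copy()
    fbest = np.array([f(p) for p in x])
    fworst = fbest.copy()
    gbest = pbest[np.argmin(fbest)].copy()
    w, c1, c2, c3 = 0.7, 1.2, 1.5, 0.3       # inertia, cognitive, social, worst-repulsion
    for _ in range(iters):
        r1, r2, r3 = rng.random((3, n_particles, dim))
        v = (w * v
             + c1 * r1 * (pbest - x)          # pull toward personal best
             + c2 * r2 * (gbest - x)          # pull toward global best
             - c3 * r3 * (pworst - x))        # push away from personal worst
        v = np.clip(v, -1.0, 1.0)
        x = x + v
        fx = np.array([f(p) for p in x])
        better, worse = fx < fbest, fx > fworst
        pbest[better], fbest[better] = x[better], fx[better]
        pworst[worse], fworst[worse] = x[worse], fx[worse]
        gbest = pbest[np.argmin(fbest)].copy()
    return gbest, fbest.min()

# Toy usage: minimize a quadratic "response error" with its minimum at (1, 1, 1).
best_x, best_f = mpso_minimize(lambda p: float(np.sum((p - 1.0) ** 2)), dim=3)
print("best point:", np.round(best_x, 3), " error:", round(best_f, 5))
```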
465 Online Graduate Students’ Perspective on Engagement in Active Learning in the United States
Authors: Ehi E. Aimiuwu
Abstract:
As of 2017, many researchers in educational journals are still wondering whether students are effectively and efficiently engaged in active learning in the online learning environment. The goal of this qualitative single case study and narrative research is to explore whether students are actively engaged in their online learning. Seven online students in the United States, recruited via LinkedIn and residencies, were interviewed for this study. Eleven online learning techniques from the research literature were used as a framework. The data collection tools used for the study included a digital audiotape, an observation sheet, an interview protocol, transcription, and NVivo 12 Plus qualitative software. A data analysis process, member checking and key themes were used to reach saturation. About 85.7% of students preferred individual grading. About 71.4% of students valued the professor interacting 2-3 times weekly, participating through posts and responses, having good internet access, and using email. Also, about 57.1% said they log in 2-3 times weekly to daily, that the professor's social presence helps, that punctual submission of work matters, and that they prefer assessment styles such as research, essay, and case study. About 42.9% appreciated the usefulness of the syllabus and the professor's expertise.
Keywords: Class facilitation, course management, online teaching, online education, student engagement.
464 Applying GQM Approach towards Development of Criterion-Referenced Assessment Model for OO Programming Courses
Authors: Norazlina Khamis, Sufian Idris, Rodina Ahmad
Abstract:
The most influential programming paradigm today is object-oriented (OO) programming, and it is widely used in education and industry. Recognizing the importance of equipping students with OO knowledge and skills, it is not surprising that most Computer Science degree programs offer OO-related courses. How do we assess whether students have acquired the right object-oriented skills after they have completed their OO courses? What are object-oriented skills? Currently, none of the existing assessment techniques is able to provide this answer. Traditional forms of OO programming assessment provide a way of assigning numerical scores to determine letter grades, but this rarely reveals information about how students actually understand OO concepts. It therefore appears reasonable that a better understanding of how to define and assess OO skills is needed, by developing a criterion-referenced model. This is even more critical in the context of Malaysia, where there is currently a growing concern over the level of competency of Malaysian IT graduates in object-oriented programming. This paper discusses the approach used to develop the criterion-referenced assessment model, which can serve as a guideline when conducting OO programming assessment. The proposed model is derived using the Goal Question Metric (GQM) methodology, which helps formulate the metrics of interest. The paper concludes with a few suggestions for further study.
Keywords: Object-oriented programming, programming assessment, criterion-referenced assessment model, goal question metrics.
463 QoS Improvement Using Intelligent Algorithm under Dynamic Tropical Weather for Earth-Space Satellite Applications
Authors: Joseph S. Ojo, Vincent A. Akpan, Oladayo G. Ajileye, Olalekan L. Ojo
Abstract:
In this paper, an intelligent algorithm (IA) capable of adapting to dynamic tropical weather conditions is proposed based on fuzzy logic techniques. The IA effectively interacts with the quality of service (QoS) criteria, irrespective of the dynamic tropical weather, to achieve improvement in the satellite links. To achieve this, an adaptive network-based fuzzy inference system (ANFIS) has been adopted. The algorithm is capable of interacting with the weather fluctuations to generate appropriate improvements to the satellite QoS for efficient services to customers. Five years (2012-2016) of rainfall rate time series data with one-minute integration time have been used to derive fading based on the ITU-R P.618-12 propagation models. The data were obtained from measurements undertaken by the Communication Research Group (CRG), Physics Department, Federal University of Technology, Akure, Nigeria. The rain attenuation and signal-to-noise ratio (SNR) were derived for frequencies between Ku- and V-band and for the propagation angle, with respect to different transmitting powers. The simulation results show a substantial reduction in SNR, especially for applications in the area of digital video broadcast second-generation coding and modulation satellite networks.
Keywords: Fuzzy logic, intelligent algorithm, Nigeria, QoS, satellite applications, tropical weather.
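As a sketch of the fade computation behind these results, the snippet below applies the power-law specific attenuation gamma = k * R^alpha and scales it by an effective path length to estimate the SNR penalty; the k/alpha coefficients, path length and clear-sky SNR are placeholder values rather than the ITU-R P.838/P.618 table entries for the actual frequency, polarization and geometry.

```python
def specific_attenuation(rain_rate_mm_h: float, k: float, alpha: float) -> float:
    """Power-law specific attenuation gamma = k * R^alpha, in dB/km."""
    return k * rain_rate_mm_h ** alpha

def rain_fade_db(rain_rate_mm_h: float, k: float, alpha: float, eff_path_km: float) -> float:
    """Total rain attenuation along an effective slant-path length."""
    return specific_attenuation(rain_rate_mm_h, k, alpha) * eff_path_km

def snr_after_rain(clear_sky_snr_db: float, fade_db: float) -> float:
    return clear_sky_snr_db - fade_db

# Placeholder Ku-band-like coefficients and geometry (illustrative only).
k, alpha, eff_path_km = 0.03, 1.1, 5.0

for rain_rate in (10, 50, 100):     # mm/h, typical tropical exceedance levels
    fade = rain_fade_db(rain_rate, k, alpha, eff_path_km)
    print(f"R = {rain_rate:3d} mm/h  fade = {fade:5.1f} dB  "
          f"SNR = {snr_after_rain(20.0, fade):5.1f} dB")
```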
462 An Approach to Polynomial Curve Comparison in Geometric Object Database
Authors: Chanon Aphirukmatakun, Natasha Dejdumrong
Abstract:
In image processing and visualization, two bitmapped images are compared pixel by pixel. Consequently, this takes a lot of computational time, while the comparison of two vector-based images is significantly faster. Sometimes raster graphics images can be approximately converted into vector-based images by various techniques. After conversion, the problem of comparing two raster graphics images can be reduced to the problem of comparing vector graphics images; hence, the problem of pixel-by-pixel comparison can be reduced to the problem of polynomial comparison. In computer aided geometric design (CAGD), vector graphics images are compositions of curves and surfaces, and curves are defined by a sequence of control points and their polynomials. In this paper, the control points are used to compare curves. The same curve after being relocated or rotated is treated as equivalent, while two curves that differ by a scaling are considered similar curves. This paper proposes an algorithm for comparing polynomial curves by using their control points to test for equivalence and similarity. In addition, the geometric object-oriented database used to keep the curve information has also been defined in XML format for further use in curve comparisons.
Keywords: Bezier curve, Said-Ball curve, Wang-Ball curve, DP curve, CAGD, comparison, geometric object database.
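The equivalence/similarity test on control points can be sketched as below: center both control polygons, estimate a scale ratio, recover the best-fitting rotation, and call the curves equivalent when the scale is 1 and similar otherwise; the Kabsch-style alignment and tolerance are one possible realization, not necessarily the paper's matching procedure.

```python
import numpy as np

def align(P, Q):
    """Best rotation (Kabsch) aligning centered P onto centered Q, plus the scale ratio."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    scale = np.linalg.norm(Qc) / np.linalg.norm(Pc)
    U, _, Vt = np.linalg.svd(Pc.T @ (Qc / scale))
    R = U @ Vt
    if np.linalg.det(R) < 0:            # keep a proper rotation (no reflection)
        U[:, -1] *= -1
        R = U @ Vt
    return R, scale, Pc, Qc

def classify(P, Q, tol=1e-6):
    """'equivalent' if Q is a translated/rotated copy of P, 'similar' if also scaled."""
    R, scale, Pc, Qc = align(P, Q)
    if np.allclose(Pc @ R, Qc / scale, atol=tol):
        return "equivalent" if abs(scale - 1.0) < tol else "similar"
    return "different"

# Control polygon of a cubic curve and two transformed copies.
P = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 3.0], [4.0, 0.0]])
theta = np.deg2rad(30)
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
Q_rotated = P @ rot.T + np.array([5.0, -2.0])        # relocated and rotated
Q_scaled = 2.5 * (P @ rot.T) + np.array([1.0, 1.0])  # additionally scaled

print(classify(P, Q_rotated))   # expected: equivalent
print(classify(P, Q_scaled))    # expected: similar
```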
461 Mining Genes Relations in Microarray Data Combined with Ontology in Colon Cancer Automated Diagnosis System
Authors: A. Gruzdz, A. Ihnatowicz, J. Siddiqi, B. Akhgar
Abstract:
The MATCH project [1] entails the development of an automatic diagnosis system that aims to support the treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The system is based on a) colon cancer clinical data and b) biological information derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the associations between genes and their influence on tumours [2]-[11]. The methods used to process the data have to address their high complexity and potential inconsistency, and cope with missing values. They must integrate all the useful information necessary to solve the expert's question. For this purpose, the system has to learn from data, or allow a domain specialist to interactively specify the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses and adjust the algorithms used accordingly.
Keywords: Bioinformatics, gene expression, ontology, self-organizing maps.