Search results for: Attack Features
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1805

1205 An Angioplasty Intervention Simulator with a Specific Virtual Environment

Authors: G. Aloisio, L. T. De Paolis, A. De Mauro, A. Mongelli

Abstract:

One of the essential requirements of a realistic surgical simulator is to reproduce the haptic sensations due to the interactions in the virtual environment. However, the interaction needs to be performed in real-time, since a delay between the user action and the system reaction reduces the sensation of immersion. In this paper, a prototype of a coronary stent implant simulator is presented; this system allows real-time interactions with an artery by means of a specific haptic device. To improve the realism of the simulation, the building of the virtual environment is based on real patients' images, and a Web Portal is used to search geographically remote medical centres for a virtual environment with specific features in terms of pathology or anatomy. The functional architecture of the system defines several Medical Centres in which virtual environments built from real patients' images, and the related metadata with specific features in terms of pathology or anatomy, are stored. The searched data are downloaded from the Medical Centre to the Training Centre, which is provided with a specific haptic device and with the software necessary to manage the interaction in the virtual environment. After the integration of the virtual environment into the simulation system, it is possible to perform training on the specific surgical procedure.

Keywords: Medical Simulation, Web Portal, Virtual Reality.

1204 A Goal-Driven Crime Scripting Framework

Authors: Hashem Dehghanniri

Abstract:

Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or sophisticated understanding of the crime commission process, which opposes the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts. It was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., identification of SCP measures or policy-making.

Keywords: Attack modeling, crime commission process, crime script, situational crime prevention.

1203 Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain-Computer Interface Methods

Authors: Bayar Shahab

Abstract:

The fast development of technology that has advanced neuroscience and human interaction with computers has enabled solutions to various problems and issues of this new era. The Brain-Computer Interface (BCI) has opened the door to several new research areas and has been able to provide solutions to critical and vital issues such as supporting a paralyzed patient to interact with the outside world, controlling a robot arm, playing games in VR with the brain, and driving a wheelchair. This review presents the state-of-the-art methods and improvements of canonical correlation analysis (CCA), an SSVEP-based BCI method. These are the methods used to extract EEG signal features or, put differently, the features of interest that we are looking for in the EEG analyses. Each of the methods, from oldest to newest, is discussed while comparing their advantages and disadvantages. This creates a context that helps researchers understand the most state-of-the-art methods available in this field, their pros and cons, and their mathematical representations and usage. This work makes a vital contribution to the existing field of study. It differs from other similar, recently published works by providing the following: (1) stating most of the main methods used in this field in a hierarchical way, (2) explaining the pros and cons of each method and their performance, (3) presenting the gaps that exist at the end of each method, which can improve understanding and open doors to new research or improvements.
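
For readers unfamiliar with the core operation shared by the reviewed methods, the sketch below shows the standard CCA frequency-scoring step for SSVEP, assuming NumPy and scikit-learn; the reference-signal design (two harmonics, single component) is illustrative and not the configuration of any particular method reviewed here.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def cca_correlation(eeg, freq, fs, n_harmonics=2):
    """Correlate multi-channel EEG (samples x channels) with sine/cosine
    references at a candidate stimulation frequency and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    Y = np.column_stack(refs)
    cca = CCA(n_components=1)
    cca.fit(eeg, Y)
    u, v = cca.transform(eeg, Y)
    return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

# The detected SSVEP target is the candidate frequency with the largest correlation:
# detected = max(candidate_freqs, key=lambda f: cca_correlation(eeg, f, fs))
```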

Keywords: BCI, CCA, SSVEP, EEG

1202 Cryptography Over Elliptic Curve Of The Ring Fq[e], e^4 = 0

Authors: Chillali Abdelhakim

Abstract:

Groups where the discrete logarithm problem (DLP) is believed to be intractable have proved to be inestimable building blocks for cryptographic applications. They are at the heart of numerous protocols such as key agreements, public-key cryptosystems, digital signatures, identification schemes, publicly verifiable secret sharings, hash functions and bit commitments. The search for new groups with intractable DLP is therefore of great importance. The goal of this article is to study elliptic curves over the ring Fq[e], with Fq a finite field of order q and with the relation e^n = 0, n ≥ 3. The motivation for this work came from the observation that several practical discrete logarithm-based cryptosystems, such as ElGamal and the elliptic curve cryptosystems, rely on such groups. First, we describe these curves defined over a ring. Then, we study their algorithmic properties by proposing effective implementations for representing the elements and the group law. In another article we study their cryptographic properties, an attack on the elliptic discrete logarithm problem, and a new cryptosystem over these curves.
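
For concreteness, the objects discussed above can be written as follows (a sketch using the title's notation with n = 4; the Weierstrass form and non-singularity condition are the standard ones, not a quotation from the article):

```latex
% Elements of the local ring R = F_q[e] with e^4 = 0
R = \mathbb{F}_q[e] = \{\, x_0 + x_1 e + x_2 e^2 + x_3 e^3 \;:\; x_i \in \mathbb{F}_q,\ e^4 = 0 \,\}

% An elliptic curve over R in projective Weierstrass form (characteristic not 2 or 3);
% the group of points E_{a,b}(R) carries the discrete logarithm problem discussed above:
E_{a,b}(R) = \{\, [X:Y:Z] \in \mathbb{P}^2(R) \;:\; Y^2 Z = X^3 + a X Z^2 + b Z^3 \,\},
\qquad a, b \in R,\ 4a^3 + 27b^2 \in R^{\times}.
```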

Keywords: Elliptic Curve Over Ring, Discrete Logarithm Problem.

1201 Influence of Microstructural Features on Wear Resistance of Biomedical Titanium Materials

Authors: Mohsin T. Mohammed, Zahid A. Khan, Arshad N. Siddiquee

Abstract:

The field of biomedical materials plays an imperative and critical role in manufacturing a variety of biological artificial replacements in the modern world. Recently, titanium (Ti) materials are being used as biomaterials because of their superior corrosion resistance, tremendous specific strength, freedom from allergic problems, and the greatest biocompatibility compared to other competing biomaterials such as stainless steel, Co-Cr alloys, ceramics, polymers, and composite materials. However, regardless of these excellent properties, implantable Ti materials have poor shear strength and wear resistance, which limit their applications as biomaterials. Even though the wear properties of Ti alloys have shown some improvement, the effective use of biomedical Ti alloys as wear components requires a comprehensive, deep understanding of the causes and mechanisms of wear and of the techniques that can be used to improve wear behavior. This review examines current information on the effect of thermal and thermomechanical processing of implantable Ti materials on the long-term prosthetic requirements related to wear behavior. The paper focuses mainly on the evolution, evaluation and development of effective microstructural features that can improve the wear properties of bio-grade Ti materials using thermal and thermomechanical treatments.

Keywords: Wear Resistance, Heat Treatment, Thermomechanical Processing, Biomedical Titanium Materials.

1200 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Recently, quick response (QR) code payment systems have become popular. Many companies have introduced new QR code payment services, and these services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in the demographic information, usage information, and value of users between services. In this study, we conduct an analysis of real-world data provided by Nomura Research Institute, including the demographic data of users and information on users' usage of two services: LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used for analyzing such data and interpreting its features; however, the target data suffer from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values and understand the features of the data given in matrix form. Moreover, for comparing the results of the NMF analysis of two matrices, Discriminant NMF (DNMF) shows the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF and analyze the target data. As the interpretation, we show the differences in the features of users between LINE Pay and PayPay.
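
The core missing-data idea behind EMNMF can be sketched with mask-restricted multiplicative updates; this is a generic NumPy illustration, not the authors' exact EMNMF/DNMF algorithm, and the rank and iteration count are arbitrary.

```python
import numpy as np

def masked_nmf(V, mask, rank, n_iter=200, eps=1e-9):
    """Factorize V ~ W @ H using only entries where mask == 1.
    A generic masked multiplicative-update NMF, shown only to illustrate
    how unknown cells can be ignored while the factors are learned."""
    n, m = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((n, rank))
    H = rng.random((rank, m))
    for _ in range(n_iter):
        WH = W @ H
        H *= (W.T @ (mask * V)) / (W.T @ (mask * WH) + eps)
        WH = W @ H
        W *= ((mask * V) @ H.T) / ((mask * WH) @ H.T + eps)
    return W, H

# Example: users x attributes matrix with NaNs for unanswered items.
# V = np.nan_to_num(raw); mask = (~np.isnan(raw)).astype(float)
# W, H = masked_nmf(V, mask, rank=5)
```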

Keywords: Data science, non-negative matrix factorization, missing data, quality of services.

1199 Pathomorphological Features of Lungs from Brown Hares Infected with Parasites

Authors: Mariana Panayotova-Pencheva, Anetka Trifonova, Vassilena Dakova

Abstract:

A total of 790 lungs from brown hares (Lepus europeus L.) from different regions of Bulgaria were investigated during the period 2009-2017. The parasitological status and pathomorphological features of the lungs were recorded. The following parasite species were established: one nematode, Protostrongylus tauricus (7.59% prevalence); one tapeworm, the larva of Taenia pisiformis (Cysticercus pisiformis, 3.04% prevalence); and one arthropod, the larva of Linguatula serrata (Pentastomum dentatum, 0.89% prevalence). Macroscopic lesions in the lungs differed depending on the causative agents. The infections with C. pisiformis and P. dentatum were accompanied by small, mainly superficial changes in the lungs. Protostrongylid infections were associated with macroscopic changes that differed in appearance and burden. In 77.7% of cases they were nodular, and in the rest they were diffuse. The consistency of the lesions was compact. In most of the cases the alterations were grey in colour; rarely they were dark-red or marble-like. In 91.7% of these cases, they were spread over the apical parts of the large lung lobes. In 36.7% of cases the middle parts of the large lung lobes were also affected, and in 26.7% the small lung lobes. The small lung lobes were never independently infected.

Keywords: Cysticercus pisiformis, Lepus europeus, lung lesions, Pentastomum dentatum, Protostrongylus tauricus.

1198 CompPSA: A Component-Based Pairwise RNA Secondary Structure Alignment Algorithm

Authors: Ghada Badr, Arwa Alturki

Abstract:

The biological function of an RNA molecule depends on its structure. The objective of alignment is to find the homology between two or more RNA secondary structures. Knowing the common functionalities between two RNA structures allows a better understanding and a discovery of other relationships between them. Besides, identifying non-coding RNAs – RNAs that are not translated into a protein – is a popular application in which RNA structural alignment is the first step. A few methods for RNA structure-to-structure alignment have been developed. Most of these methods are partial structure-to-structure, sequence-to-structure, or structure-to-sequence alignment. Less attention is given in the literature to the use of efficient RNA structure representations, and structure-to-structure alignment methods are lacking. In this paper, we introduce an O(N²) Component-based Pairwise RNA Structure Alignment (CompPSA) algorithm, where structures are given in a component-based representation and where N is the maximum number of components in the two structures. The proposed algorithm compares two RNA secondary structures based on their weighted component features rather than on their base-pair details. Extensive experiments are conducted illustrating the efficiency of the CompPSA algorithm when compared to other approaches on different real and simulated datasets. The CompPSA algorithm shows an accurate similarity measure between components. The algorithm gives the user the flexibility to align the two RNA structures based on their weighted features (position, full length, and/or stem length). Moreover, the algorithm proves scalable and efficient in time and memory performance.
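
As an illustration of the component-based comparison described above (not the published CompPSA pseudocode), a structure can be held as a list of components carrying the three weighted features, and two structures compared with an O(N²) pass:

```python
from dataclasses import dataclass

@dataclass
class Component:
    position: int      # start position of the component in the structure
    full_length: int   # overall length of the component
    stem_length: int   # length of the stem part

def component_distance(a, b, w=(1.0, 1.0, 1.0)):
    """Weighted distance between two components over the three features named
    in the abstract; the weights and absolute-difference form are assumptions."""
    return (w[0] * abs(a.position - b.position)
            + w[1] * abs(a.full_length - b.full_length)
            + w[2] * abs(a.stem_length - b.stem_length))

def pairwise_similarity(s1, s2, w=(1.0, 1.0, 1.0)):
    """O(N^2) comparison: each component of s1 is matched to its closest
    component in s2, and the total distance is turned into a similarity score."""
    total = sum(min(component_distance(c1, c2, w) for c2 in s2) for c1 in s1)
    return 1.0 / (1.0 + total / max(len(s1), 1))
```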

Keywords: Alignment, RNA secondary structure, pairwise, component-based, data mining.

1197 A Novel Approach for Protein Classification Using Fourier Transform

Authors: A. F. Ali, D. M. Shawky

Abstract:

Discovering new biological knowledge from high-throughput biological data is a major challenge for bioinformatics today. To address this challenge, we developed a new approach for protein classification. Proteins that are evolutionarily – and thereby functionally – related are said to belong to the same classification. Identifying protein classification is of fundamental importance for documenting the diversity of the known protein universe. It also provides a means to determine the functional roles of newly discovered protein sequences. Our goal is to predict the functional classification of novel protein sequences based on a set of features extracted from each protein sequence. The proposed technique uses datasets extracted from the Structural Classification of Proteins (SCOP) database. A set of spectral-domain features based on the Fast Fourier Transform (FFT) is used. The proposed classifier uses a multilayer back-propagation (MLBP) neural network for protein classification. The maximum classification accuracy is about 91% when applying the classifier to the full four levels of the SCOP database. However, it reaches a maximum of 96% when limiting the classification to the family level. The classification results reveal that the spectral domain contains information that can be used for classification with high accuracy. In addition, the results emphasize that sequence similarity measures are of great importance, especially at the family level.
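
A minimal sketch of the spectral feature-extraction step, assuming the sequence is first mapped to numbers through some physicochemical scale; the property map, feature length, and helper names are placeholders, not the paper's exact encoding.

```python
import numpy as np

def fft_features(sequence, property_map, n_coeffs=32):
    """Map a protein sequence to a numeric signal and keep the magnitudes of
    the first FFT coefficients as a fixed-length feature vector.
    `property_map` (amino acid -> number) stands in for whatever
    physicochemical scale the classifier is actually trained on."""
    signal = np.array([property_map.get(aa, 0.0) for aa in sequence])
    spectrum = np.abs(np.fft.rfft(signal))
    features = np.zeros(n_coeffs)
    k = min(n_coeffs, spectrum.size)
    features[:k] = spectrum[:k]
    return features

# The resulting vectors would then be fed to a classifier (the paper uses a
# multilayer back-propagation neural network) trained on SCOP labels.
```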

Keywords: Bioinformatics, Artificial Neural Networks, Protein Sequence Analysis, Feature Extraction.

1196 Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective

Authors: Isaac O. Asante, Yushi Jiang, Hailin Tao

Abstract:

Livestreaming marketing, the new electronic commerce element, has become an optional marketing channel following the COVID-19 pandemic, and many sellers are leveraging the features presented by livestreaming to increase sales. This study was conducted to measure real-time observable interactions between consumers and sellers. Based on affordance theory, this study conceptualized constructs representing the interactive features and examined how they drive consumers' purchase willingness during livestreaming sessions, using 1238 data records from Amazon Live collected through manual observation of transaction records. Using structural equation modeling, the ordinary least squares regression suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness. The Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study presents a way of measuring interactions in livestreaming commerce and proposes a way to manually gather data on consumer behaviors on livestreaming platforms when the application programming interface (API) of such platforms does not support data mining algorithms.
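
The Sobel test mentioned above has a standard closed form; a minimal sketch follows, with illustrative variable names (the path coefficients and standard errors would come from the fitted regressions).

```python
import math
from scipy.stats import norm

def sobel_test(a, se_a, b, se_b):
    """Sobel z-statistic for an indirect effect a*b, where `a` is the path
    from the predictor to the mediator and `b` the path from the mediator
    to the outcome, with their standard errors."""
    z = (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    p = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value
    return z, p

# Example: live viewers -> live chats (a) -> purchase willingness (b);
# the numbers below are purely illustrative.
# z, p = sobel_test(a=0.42, se_a=0.05, b=0.31, se_b=0.04)
```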

Keywords: Livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness.

1195 A POX Controller Module to Prepare a List of Flow Header Information Extracted from SDN Traffic

Authors: Wisam H. Muragaa, Kamaruzzaman Seman, Mohd Fadzli Marhusin

Abstract:

Software Defined Networking (SDN) is a paradigm designed to facilitate dynamic and more agile control of the network. Network traffic is a set of flows, each of which contains a set of packets. In SDN, a matching process is performed in the SDN switch on every packet coming to the network. Only the headers of new packets are forwarded to the SDN controller. In this terminology, the flow header fields are called tuples. Basically, these form a 5-tuple: the source and destination IP addresses, the source and destination ports, and the protocol number. This flow information is used to provide an overview of the network traffic. Our module is meant to extract this 5-tuple, together with the packet and flow counts, and show them as a list. This list can then be used as a first step toward detecting a DDoS attack. Thus, the module can be considered the beginning stage of a flow-based DDoS detection method.
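
A minimal sketch of such a module, assuming POX's standard PacketIn event API and packet-library field names; table-miss rule installation and error handling are omitted.

```python
# Sketch of a POX component that records the 5-tuple of each flow seen in
# PacketIn events, plus a simple per-flow packet counter.
from pox.core import core

log = core.getLogger()
flows = {}  # (src_ip, dst_ip, src_port, dst_port, proto) -> packet count

def _handle_PacketIn(event):
    packet = event.parsed
    ip = packet.find('ipv4')
    if ip is None:
        return
    l4 = packet.find('tcp') or packet.find('udp')
    sport = l4.srcport if l4 else 0
    dport = l4.dstport if l4 else 0
    tup = (str(ip.srcip), str(ip.dstip), sport, dport, ip.protocol)
    flows[tup] = flows.get(tup, 0) + 1

def list_flows():
    """Return the list that a later flow-based DDoS detector would consume."""
    return [tup + (count,) for tup, count in flows.items()]

def launch():
    core.openflow.addListenerByName("PacketIn", _handle_PacketIn)
    log.info("5-tuple collector started")
```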

Keywords: Matching, OpenFlow tables, POX controller, SDN, table-miss.

1194 Performance of On-site Earthquake Early Warning Systems for Different Sensor Locations

Authors: Ting-Yu Hsu, Shyu-Yu Wu, Shieh-Kung Huang, Hung-Wei Chiang, Kung-Chun Lu, Pei-Yang Lin, Kuo-Liang Wen

Abstract:

Regional earthquake early warning (EEW) systems are not suitable for Taiwan, as most destructive seismic hazards arise from in-land earthquakes. These can reduce the lead-time provided by a regional EEW system before a destructive earthquake wave arrives to essentially zero. On the other hand, an on-site EEW system can provide more lead-time in a region closer to the epicenter, since only seismic information from the target site is required. Instead of leveraging the information of several stations, the on-site system extracts some P-wave features from the first few seconds of vertical ground acceleration of a single station and predicts the oncoming earthquake intensity at the same station from these features. Since seismometers can be triggered by non-earthquake events such as a passing truck or other human activities, a seismometer was installed at three different locations on the same site to reduce the likelihood of false alarms, and the performance of the EEW system for these three sensor locations was discussed. The results show that the location on the ground of the first floor of a school building may be a good choice, since false alarms could be reduced and the cost of installation and maintenance is the lowest.
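
The abstract does not list the exact P-wave features used; purely as an illustration, two commonly cited on-site features (peak amplitude and cumulative absolute velocity over an assumed 3 s window) could be extracted like this.

```python
import numpy as np

def pwave_window_features(acc_z, fs, window_s=3.0):
    """Illustrative on-site EEW features from the first seconds of vertical
    acceleration after the P-wave trigger. The specific features and the
    3 s window are assumptions, not the paper's actual feature set."""
    n = int(window_s * fs)
    w = np.asarray(acc_z[:n], dtype=float)
    peak_acc = np.max(np.abs(w))      # peak vertical acceleration in the window
    cav = np.sum(np.abs(w)) / fs      # cumulative absolute velocity
    return peak_acc, cav
```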

Keywords: Earthquake early warning, Single station approach, Seismometer location.

1193 Traceable Watermarking System using SoC for Digital Cinema Delivery

Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi

Abstract:

As digital technology develops, digital cinema is becoming more widespread. However, content copying and attacks against digital cinema have become a serious problem. To solve this security problem, we propose “Additional Watermarking” for a digital cinema delivery system. With the proposed “Additional Watermarking” method, we protect content copyrights at the encoder and user-side information at the decoder. It realizes the traceability of the watermark embedded at the encoder. The watermark is embedded into randomly selected frames using a hash function. The embedding positions are distributed by the hash function so that third parties cannot break the watermarking algorithm. Finally, our experimental results show that the proposed method is much better than conventional watermarking techniques in terms of robustness, image quality, and its simple but unbreakable algorithm.
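
A minimal sketch of keyed, hash-driven frame selection (the general idea described above); the choice of SHA-256 and of the identifiers fed into the hash are assumptions, not the paper's specification.

```python
import hashlib

def select_frames(content_id, user_id, total_frames, n_marks):
    """Derive a repeatable, pseudo-random set of frame indices for watermark
    embedding from a hash of the content and user identifiers. Without these
    identifiers, a third party cannot predict which frames carry the mark."""
    chosen = set()
    counter = 0
    while len(chosen) < n_marks:
        msg = f"{content_id}:{user_id}:{counter}".encode()
        digest = hashlib.sha256(msg).digest()
        chosen.add(int.from_bytes(digest[:4], "big") % total_frames)
        counter += 1
    return sorted(chosen)

# e.g. select_frames("movie-0042", "theatre-17", total_frames=172800, n_marks=64)
```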

Keywords: Decoder, Digital content, JPEG2000 Frame, System-On-Chip and additional watermark.

1192 Fast Wavelet Image Denoising Based on Local Variance and Edge Analysis

Authors: Gaoyong Luo

Abstract:

The approach based on the wavelet transform has been widely used for image denoising due to its multi-resolution nature, its ability to produce high levels of noise reduction, and the low level of distortion introduced. However, by removing noise, high-frequency components belonging to edges are also removed, which blurs the signal features. This paper proposes a new method of image noise reduction based on local variance and edge analysis. The analysis is performed by dividing an image into 32 x 32 pixel blocks and transforming the data into the wavelet domain. A fast lifting wavelet spatial-frequency decomposition and reconstruction is developed, with the advantages of computational efficiency and minimized boundary effects. Adaptive thresholding by local variance estimation and edge strength measurement can effectively reduce image noise while preserving the features of the original image corresponding to the boundaries of the objects. Experimental results demonstrate that the method performs well for images contaminated by natural and artificial noise and is suitable to be adapted for different classes of images and types of noise. The proposed algorithm provides a potential solution with parallel computation for real-time or embedded system applications.
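
A minimal sketch of the block-wise wavelet thresholding idea using PyWavelets; the wavelet, the MAD noise estimate, and the threshold rule are illustrative stand-ins for the paper's lifting scheme and edge-adaptive thresholds.

```python
import numpy as np
import pywt

def denoise_block(block, wavelet="haar", level=2, k=1.0):
    """Soft-threshold the detail coefficients of one image block. Noise is
    estimated from the finest diagonal subband via the median absolute
    deviation; k stands in for the edge-strength adaptation in the paper."""
    coeffs = pywt.wavedec2(block.astype(float), wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise std estimate
    thr = k * sigma * np.sqrt(2 * np.log(block.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    rec = pywt.waverec2(new_coeffs, wavelet)
    return rec[:block.shape[0], :block.shape[1]]

def denoise_image(img, block=32):
    """Process the image in 32 x 32 blocks (assumes dimensions are multiples
    of the block size, as in the abstract's setup)."""
    out = np.empty(img.shape, dtype=float)
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            out[i:i+block, j:j+block] = denoise_block(img[i:i+block, j:j+block])
    return out
```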

Keywords: Edge strength, Fast lifting wavelet, Image denoising, Local variance.

1191 DWT-Based Robust Watermark Embedding Using CRC-32 Techniques

Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi

Abstract:

As far as the latest technological improvements are concerned, digital systems have become more popular than in the past. Despite this growing demand for digital systems, content copying and attacks against digital cinema content have become a serious problem. To solve this security problem, we propose “traceable watermarking” using hash functions for digital cinema systems. Digital cinema is a great application for traceable watermarking, since it uses watermarking technology during content play as well as content transmission. The watermark is embedded into randomly selected movie frames using CRC-32 techniques. CRC-32 is a hash function; using it, the embedding positions are distributed so that no third party can break the watermarking or change it. Finally, our experimental results show that the proposed DWT watermarking method using CRC-32 is much better than conventional watermarking techniques in terms of robustness, image quality, and its simple but unbreakable algorithm.
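
A minimal sketch of CRC-32-driven position spreading; the way positions index DWT coefficients, and the inputs fed to the CRC, are assumptions made only for illustration.

```python
import zlib

def embedding_positions(frame_index, user_key, n_coeffs, n_bits):
    """Use CRC-32 as a cheap hash to spread watermark bits over the DWT
    coefficients of a selected frame. Only the CRC-32-driven spreading
    follows the abstract; the coefficient indexing is illustrative."""
    positions = []
    counter = 0
    while len(positions) < n_bits:
        crc = zlib.crc32(f"{user_key}:{frame_index}:{counter}".encode())
        pos = crc % n_coeffs
        if pos not in positions:
            positions.append(pos)
        counter += 1
    return positions

# Without user_key, a third party cannot reproduce (or strip) the positions.
```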

Keywords: Decoder, Digital content, JPEG2000 Frame, System-On-Chip, traceable watermark, Hash Function, CRC-32.

1190 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning

Authors: Kaushik Sathupadi, Sandesh Achar

Abstract:

Human action recognition (HAR) modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection techniques for the human body and its outer shape. Next, we extract valuable information in terms of cues. We extract two distinct features: fuzzy local binary patterns and a sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification we use a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH Multi-view Football dataset. Our HAR framework significantly outperforms the other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% over the AAMAZ and KTH Multi-view Football datasets, respectively.
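
As a rough illustration of the feature-plus-random-forest pipeline (using plain uniform LBP from scikit-image as a stand-in for the paper's fuzzy LBP, and omitting the sequence representation and GRASP optimization):

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

def lbp_histogram(gray, P=8, R=1.0):
    """Uniform LBP histogram of a grayscale frame; a plain stand-in for the
    fuzzy local binary patterns used in the paper."""
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

# X: one LBP histogram per frame or clip, y: action labels.
# clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
# accuracy = clf.score(X_test, y_test)
```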

Keywords: Computer vision, human motion analysis, random forest, machine learning.

1189 Abnormal IP Packets on 3G Mobile Data Networks

Authors: Joo-Hyung Oh, Dongwan Kang, JunHyung Cho, Chaetae Im

Abstract:

As the mobile Internet has become widespread in recent years, communication based on mobile networks is increasing. As a result, security threats have been posed with regard to the abnormal traffic of mobile networks, but mobile security has been handled with a focus on threats posed by mobile malicious code, and research on security threats to the mobile network itself has not attracted much attention. In mobile networks, the IP address of a data packet is a very important factor for billing purposes. If a mobile terminal uses an incorrect IP address that either does not exist or could be assigned to another mobile terminal, the billing policy will cause problems. We monitored and analyzed 3G mobile data network traffic for a period of time and found some abnormal IP packets. In this paper, we analyze the reasons for abnormal IP packets on 3G mobile data networks. We also propose an algorithm to detect abnormal IP packets, based on an IP address table that contains the addresses currently in use within the mobile data network.
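
A minimal sketch of the address-table lookup idea; how address assignments are learned from session signalling is outside the snippet, and the class layout is an assumption.

```python
import ipaddress

class AbnormalIPDetector:
    """Flag data packets whose source IP is not currently assigned to any
    terminal in the mobile data network. A sketch of the address-table idea
    from the abstract; learning assignments from session setup/teardown
    signalling is assumed to happen elsewhere."""

    def __init__(self):
        self.assigned = set()  # IP addresses currently in use

    def on_session_start(self, ip):
        self.assigned.add(ipaddress.ip_address(ip))

    def on_session_end(self, ip):
        self.assigned.discard(ipaddress.ip_address(ip))

    def is_abnormal(self, src_ip):
        return ipaddress.ip_address(src_ip) not in self.assigned

# det = AbnormalIPDetector(); det.on_session_start("10.20.30.40")
# det.is_abnormal("10.99.99.99")  # True: unassigned source, candidate abnormal packet
```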

Keywords: WCDMA, 3G, Abnormal IP address, Mobile Data Network Attack

1188 Effect of Runup over a Vertical Pile Supported Caisson Breakwater and Quarter Circle Pile Supported Caisson Breakwater

Authors: T. J. Jemi Jeya, V. Sriram

Abstract:

The pile-supported caisson breakwater is an eco-friendly breakwater that is very useful in coastal zone protection. The model is developed by considering the advantages of both the caisson breakwater and the pile-supported breakwater, where the top portion is a vertical or quarter-circle caisson and the bottom portion consists of a pile-supported breakwater; these are defined as the Vertical Pile Supported Caisson Breakwater (VPSCB) and the Quarter-circle Pile Supported Caisson Breakwater (QPSCB). The study mainly focuses on a comparison of run-up over the VPSCB and QPSCB under oblique waves. The experiments are carried out in a shallow wave basin under different water depths (d = 0.5 m and 0.55 m) and under different oblique regular waves (0°, 15°, 30°). The run-up over the surface is measured by placing two run-up probes over the surface at 0.3 m on both sides from the centre of the model. The results show that the non-dimensional shoreward run-up shows a slight decrease with increasing angle of wave attack.

Keywords: Caisson breakwater, pile supported breakwater, quarter circle breakwater, vertical breakwater.

1187 Texture Feature-Based Language Identification Using Wavelet-Domain BDIP and BVLC Features and FFT Feature

Authors: Ick Hoon Jang, Hoon Jae Lee, Dae Hoon Kwon, Ui Young Pak

Abstract:

In this paper, we propose a texture feature-based language identification using wavelet-domain BDIP (block difference of inverse probabilities) and BVLC (block variance of local correlation coefficients) features and an FFT (fast Fourier transform) feature. In the proposed method, wavelet subbands are first obtained by wavelet transform from a test image and denoised by Donoho's soft-thresholding. BDIP and BVLC operators are next applied to the wavelet subbands. FFT blocks are also obtained by 2D (two-dimensional) FFT from the blocks into which the test image is partitioned. Some significant FFT coefficients in each block are selected and the magnitude operator is applied to them. Moments for each subband of BDIP and BVLC and for each magnitude of significant FFT coefficients are then computed and fused into a feature vector. In classification, a stabilized Bayesian classifier, which adopts variance thresholding, searches for the training feature vector most similar to the test feature vector. Experimental results show that the proposed method with the three operations yields excellent language identification even with a rather low feature dimension.
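
For reference, one common formulation of the BDIP operator on an image (or subband) block is sketched below; BVLC, the FFT feature, and the stabilized Bayesian classifier are not reproduced, and the block size is an assumption.

```python
import numpy as np

def bdip(block, eps=1e-6):
    """Block difference of inverse probabilities for one block: the number of
    pixels minus the ratio of the intensity sum to the block maximum. This
    follows the usual textbook definition of BDIP, not necessarily the exact
    wavelet-domain variant used in the paper."""
    block = block.astype(float)
    return block.size - block.sum() / (block.max() + eps)

def bdip_map(image, m=4):
    """BDIP computed on non-overlapping m x m blocks of a 2-D array
    (e.g. a wavelet subband), returning one value per block."""
    h, w = (image.shape[0] // m) * m, (image.shape[1] // m) * m
    blocks = image[:h, :w].reshape(h // m, m, w // m, m).swapaxes(1, 2)
    return np.array([[bdip(b) for b in row] for row in blocks])
```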

Keywords: BDIP, BVLC, FFT, language identification, texture feature, wavelet transform.

1186 Investigation of the Effect of Cavitator Angle and Dimensions for a Supercavitating Vehicle

Authors: Sri Raman A., A.K.Ghosh

Abstract:

At very high speeds, bubbles form around underwater vehicles because of sharp trailing edges or places where the local pressure is lower than the vapor pressure. These bubbles are called cavities, and the size of the cavities grows as the velocity increases. A properly designed cavitator can induce the formation of a single big cavity all over the vehicle. Such a vehicle travelling in the vaporous cavity is called a supercavitating vehicle, and the present research work mainly focuses on the dynamic modeling of such vehicles. Cavitation of the fins is also accounted for, and its effect on the trajectory is explained. The entire dynamics has been developed using the state-space approach, and emphasis is given to the effect of the size and angle of attack of the cavitator. A control law has been established for the motion of the vehicle using Non-linear Dynamic Inverse (NDI) with the cavitator as the control surface.
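
The NDI control law referred to above has a standard generic form for control-affine dynamics; it is shown here only as background, not as the paper's specific six-DOF vehicle model.

```latex
% Generic nonlinear dynamic inversion (NDI) for an affine-in-control system
\dot{x} = f(x) + g(x)\,u, \qquad
u = g(x)^{-1}\bigl(\nu - f(x)\bigr)
\;\;\Longrightarrow\;\; \dot{x} = \nu,
% where \nu is the commanded (desired) dynamics, e.g. \nu = K\,(x_{\mathrm{cmd}} - x).
% For the supercavitating vehicle, u would include the cavitator deflection.
```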

Keywords: High speed underwater vehicle, Non-Linear Dynamic Inverse (NDI), six-dof modeling, Supercavitation, Torpedo.

1185 Evaluating Complexity – Ethical Challenges in Computational Design Processes

Authors: J.Partanen

Abstract:

Complexity, as a theoretical background, has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing terminology for design from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution, and adaptation are appropriate to describe the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems. Increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment, and consequently to seek novel methods for steering development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity, and respond to our needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics. Self-organization or evolution is neither good nor bad; their mechanisms are by nature devoid of reason. They are common both in natural processes and in urban dynamics. They are features of a complex system, and they cannot be prevented. Yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the (unexciting) ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on grounds of being “non-human”. In this paper, the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models, and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Keywords: urban planning, architecture, dynamic modeling, ethics, complexity theory.

1184 Performance of Bridge Girder with Perforations under Tsunami Wave Loading

Authors: Sadia Rahman, Shatirah Akib, M. T. R. Khan, R. Triatmadja

Abstract:

Tsunami disasters pose a great threat to coastal infrastructure. Bridges without adequate provisions for earthquake and tsunami loading are generally vulnerable to tsunami attack. During the last two disastrous tsunami events (the Indian Ocean and Japan tsunamis), a number of bridges were observed to sustain damage from tsunami waves. In this study, laboratory experiments were conducted to study the effect of perforations in the bridge girder on force reduction. Results showed that a significant amount of force was reduced by using perforations in the girder. Approximately 10% to 18% force reduction was achieved using about 16% perforations in the bridge girder. This amount of force reduction reveals that perforations in the girder are effective in reducing tsunami forces, as they let water pass through. Thus, less bridge damage is expected with the presence of perforations in the girder during a tsunami.

Keywords: Bridge, force, girder, perforation, tsunami, wave.

1183 Exploring the Physical Environment and Building Features in Earthquake Disaster Areas

Authors: Chang Hsueh-Sheng, Chen Tzu-Ling

Abstract:

Earthquakes are unpredictable natural disasters, and intense earthquakes have caused serious impacts on the socio-economic system and on environmental and social resilience. Conventional ways to mitigate earthquake disasters are to enhance building codes and advance structural engineering measures. However, earthquake-induced ground damage such as liquefaction, land subsidence and landslides happens in places near earthquake-prone areas or areas with poor soil conditions. Therefore, this study uses spatial statistical analysis to explore the spatial pattern of damaged buildings. Afterwards, principal components analysis (PCA) is applied to categorize the similar features in different kinds of clustered patterns. The results show that serious landslide-prone areas, closeness to faults, vegetated ground surface, and mudslide-prone areas are common among highly damaged buildings. In addition, the oldest buildings are not necessarily the most vulnerable ones. In fact, it seems that buildings built between 1974 and 1989 were more fragile during the earthquake. The incorporation of both spatial statistical analyses and PCA can provide more accurate information to support retrofit programs to enhance earthquake resistance in particular areas.
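
A minimal sketch of the PCA step on a building-by-feature matrix with scikit-learn; the feature names and number of components are placeholders, not the study's actual variables.

```python
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# X: one row per damaged building, columns are environment/building features
# (e.g. distance to fault, slope, construction year); names are placeholders.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=3)
scores = pca.fit_transform(X_std)        # building coordinates in component space
print(pca.explained_variance_ratio_)     # variance captured by each component
print(pca.components_)                   # loadings: which features dominate each component
```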

Keywords: Earthquake disaster, spatial statistical analysis, principal components analysis, clustered patterns.

1182 Security in Resource-Constrained Networks: Lightweight Encryption for Z-MAC

Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy

Abstract:

A wireless sensor network is formed by a combination of nodes that systematically transmit data to their base stations. Given the limited processing power of these nodes and the need for data consistency, this transmitted data can easily be compromised, and there is an ongoing discussion on how to secure the data transfer or transmission in real time. This paper presents a mechanism to securely transmit data over a chain of sensor nodes, without compromising the throughput of the network, by utilizing the battery resources available in the sensor nodes. Our methodology takes advantage of the efficiency of the Z-MAC protocol and provides a unique key through a sharing mechanism based on neighbor node MAC addresses. We present a lightweight data integrity layer embedded in the Z-MAC protocol and show that our protocol performs better than plain Z-MAC when different attack scenarios are introduced.
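
A minimal sketch of the neighbor-based key idea and a lightweight integrity tag, assuming an HMAC-SHA256 construction; the paper's exact key-sharing and integrity-layer design are not specified here.

```python
import hashlib
import hmac

def pairwise_key(own_mac, neighbor_mac, network_secret):
    """Derive a per-link key from the two nodes' MAC addresses and a shared
    network secret. Sorting the addresses makes both neighbors derive the
    same key; the construction is an illustrative assumption."""
    a, b = sorted([own_mac.lower(), neighbor_mac.lower()])
    return hmac.new(network_secret, f"{a}|{b}".encode(), hashlib.sha256).digest()

def frame_tag(key, payload):
    """Lightweight integrity tag appended to each frame (truncated HMAC)."""
    return hmac.new(key, payload, hashlib.sha256).digest()[:8]

# key = pairwise_key("00:11:22:33:44:55", "66:77:88:99:aa:bb", b"network-secret")
# tag = frame_tag(key, b"sensor reading #42")
```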

Keywords: Hybrid MAC protocol, data integrity, lightweight encryption, Neighbor based key sharing, Sensor node data processing, Z-MAC.

1181 A Study on Energy-efficient Temperature Control

Authors: Mitsuyuki Kawakami, Kimihiro Yamanaka

Abstract:

The top-heavy demographics of a low birth rate and longer lifespans are a growing social problem, and one of the expected effects will be a shortage of young workers and a growing reliance on a workforce of middle-aged and older people. However, the environment of today's industrial workplace is not particularly suited to middle-aged and older workers, one notable problem being temperature control. Higher temperatures can cause health problems such as heat stroke, and the number of cases increases sharply in people over 65. Moreover, in conditions above 33°C, older people can develop circulatory system disorders and also have a higher chance of suffering a fatal heart attack. We therefore propose a new method for controlling temperature in the indoor workplace. In this study, two different verification experiments were conducted, with the proposed temperature control method being tested in cargo containers and conventional houses. The method's effectiveness was apparent in measurements of temperature and electricity consumption.

Keywords: CO2 reduction, Energy saving, Temperature control

1180 Application of Artificial Neural Network for the Prediction of Pressure Distribution of a Plunging Airfoil

Authors: F. Rasi Maezabadi, M. Masdari, M. R. Soltani

Abstract:

A series of experimental tests was conducted on a section of a 660 kW wind turbine blade to measure the pressure distribution of this model oscillating in plunging motion. In order to minimize the amount of data required to predict the aerodynamic loads of the airfoil, a General Regression Neural Network (GRNN) was trained using the measured experimental data. The network, once proved to be accurate enough, was used to predict the flow behavior of the airfoil for the desired conditions. Results showed that, using only a few of the acquired data, the trained neural network was able to predict accurate results with minimal error when compared with the corresponding measured values. Therefore, by employing this trained network, the aerodynamic coefficients of the plunging airfoil are predicted accurately at different oscillation frequencies, amplitudes, and angles of attack, hence reducing the cost of tests while achieving acceptable accuracy.
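
For background, a GRNN prediction is a Gaussian-kernel weighted average of the training targets; the sketch below is a generic GRNN predictor, not the specific trained network of the study, and the input variables named in the comments are illustrative.

```python
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.1):
    """General Regression Neural Network prediction for one query point:
    a Gaussian-kernel weighted average of the training targets. Rows of
    X_train are training inputs (e.g. reduced frequency, amplitude, angle
    of attack), y_train the measured coefficients, sigma the spread."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.dot(w, y_train) / (np.sum(w) + 1e-12)

# cp_hat = grnn_predict(X_train, cp_train, np.array([k, amp, aoa]), sigma=0.05)
```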

Keywords: Airfoil, experimental, GRNN, Neural Network, Plunging.

1179 Dynamic Clustering Estimation of Tool Flank Wear in Turning Process using SVD Models of the Emitted Sound Signals

Authors: A. Samraj, S. Sayeed, J. E. Raja., J. Hossen, A. Rahman

Abstract:

Monitoring tool flank wear without affecting the throughput is considered the prudent method in production technology. The examination has to be done without affecting the machining process. In this paper we propose a novel method to determine tool flank wear by observing the sound signals emitted during the turning process. The work-piece materials used here were steel and aluminum, and the cutting insert was carbide. Two different cutting speeds were used in this work. The feed rate and the cutting depth were constant, whereas the flank wear was a variable. The emitted sound signals of a fresh tool (0 mm flank wear), a slightly worn tool (0.2-0.25 mm flank wear) and a severely worn tool (0.4 mm and above flank wear) during the turning process were recorded separately using a highly sensitive microphone. Analysis using Singular Value Decomposition was done on these sound signals to extract the feature sound components. Observation of the results showed that an increase in tool flank wear correlates with an increase in the values of the SVD features produced from the sound signals for both materials. Hence it can be concluded that monitoring tool flank wear during the turning process using SVD features with Fuzzy C-means classification of the emitted sound signal is a promising and relatively simple method.
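
A minimal sketch of turning a recorded sound signal into SVD features; the frame length, hop size, and number of retained singular values are assumptions, not the paper's settings.

```python
import numpy as np

def svd_features(signal, frame_len=1024, hop=512, n_values=10):
    """Arrange the emitted sound signal into overlapping frames (rows of a
    matrix) and return the leading singular values as a feature vector."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    X = np.array(frames, dtype=float)
    s = np.linalg.svd(X, compute_uv=False)
    return s[:n_values]

# Feature vectors from fresh / slightly worn / severely worn recordings can then
# be grouped with Fuzzy C-means (e.g. skfuzzy.cmeans) to monitor wear.
```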

Keywords: Fuzzy C-means, Microphone, Singular Value Decomposition, Tool Flank Wear.

1178 A Novel Framework for Abnormal Behaviour Identification and Detection for Wireless Sensor Networks

Authors: Muhammad R. Ahmed, Xu Huang, Dharmendra Sharma

Abstract:

Despite extensive study on wireless sensor network security, defending against internal attacks and finding abnormal behaviour of a sensor are still difficult and unsolved tasks. Conventional cryptographic techniques do not provide the robust security or detection process needed to save the network from an internal attacker whose attack is caused by abnormal behavior. A framework for identifying the insider attacker or abnormally behaving sensor and detecting its location, using false message detection and Time Difference of Arrival (TDoA), is presented in this paper. It has been shown that the new framework can efficiently identify and detect the insider attacker's location, so that the attacker can be reprogrammed or removed from the network to protect it from internal attack.
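
For reference, the TDoA relation underlying the location-detection step is the standard one: each measured time difference constrains the emitter position to a hyperbola (a hyperboloid in 3-D). This is background only, not the paper's specific estimator.

```latex
% TDoA constraint between receiver nodes s_i and s_j for an emitter at position x,
% with propagation speed c and arrival times t_i, t_j:
\lVert x - s_i \rVert - \lVert x - s_j \rVert = c\,(t_i - t_j) = c\,\Delta t_{ij}
% Intersecting the constraints from several node pairs yields the estimated
% location of the abnormally behaving (insider) node.
```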

Keywords: Insider attacker identification, abnormal behaviour, location detection, Time Difference of Arrival (TDoA), wireless sensor network.

1177 Study on the Effect of Road Infrastructure, Socio-Economic and Demographic Features on Road Crashes in Bangladesh

Authors: Shakil M. Rifaat, Md. H. Rahman, Mohammed, Mosabbir Pasha

Abstract:

Road crashes not only claim lives and inflict injuries but also create an economic burden on society due to loss of productivity. The problem of deaths and injuries as a result of road traffic crashes is now acknowledged to be a global phenomenon, with authorities in virtually all countries of the world concerned about the growth in the number of people killed and seriously injured on their roads. However, the road crash scenario of a developing country like Bangladesh is much worse compared with that of developed countries. To develop proper countermeasures, it is necessary to identify the factors affecting crash occurrence. The objective of the study is to examine the effect of district-wise road infrastructure, socio-economic and demographic features on crash occurrence. The unit of analysis is the individual district, which has not been explored much in the past. Reported crash data obtained from the Bangladesh Road Transport Authority (BRTA) for the years 2004 to 2010 are utilized to develop a negative binomial model. The model results reveal the effect of road length (both paved and unpaved), road infrastructure and several socio-economic characteristics on district-level crash frequency in Bangladesh.
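
A minimal sketch of fitting such a district-level count model with statsmodels; the file name and covariate columns are placeholders, not the study's actual data or variable list.

```python
import pandas as pd
import statsmodels.api as sm

# df: one row per district with a crash count and candidate covariates.
# Column names below are placeholders, not the study's variable list.
df = pd.read_csv("district_crashes_2004_2010.csv")  # hypothetical file
X = sm.add_constant(df[["paved_road_km", "unpaved_road_km",
                        "population_density", "literacy_rate"]])
y = df["crash_count"]

nb_model = sm.NegativeBinomial(y, X).fit()   # negative binomial count regression
print(nb_model.summary())                    # coefficients show each factor's effect
```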

Keywords: Demographic, Negative Binomial Model, Road Infrastructure, Socio-economic, Traffic Safety.

1176 Predictive Analytics of Student Performance Determinants in Education

Authors: Mahtab Davari, Charles Edward Okon, Somayeh Aghanavesi

Abstract:

Every institute of learning is usually interested in the performance of its enrolled students. The level of these performances determines the approach an institute of study may adopt in rendering academic services. The focus of this paper is to evaluate students' academic performance in given courses of study using machine learning methods. This study evaluated various supervised machine learning classification algorithms such as Logistic Regression (LR), Support Vector Machine (SVM), Random Forest, Decision Tree, K-Nearest Neighbors, Linear Discriminant Analysis (LDA), and Quadratic Discriminant Analysis, using selected features to predict study performance. The accuracy, precision, recall, and F1 score obtained from a 5-fold cross-validation were used to determine the best classification algorithm for predicting students' performance. SVM (using a linear kernel), LDA, and LR were identified as the best-performing machine learning methods. Also, using the LR model, this study identified students' educational habits, such as reading and paying attention in class, as strong determinants of above-average performance. Other important features include the student's academic history and work. Demographic factors such as age, gender, high school graduation, etc., had no significant effect on a student's performance.
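
A minimal sketch of the 5-fold cross-validation comparison with scikit-learn; the feature matrix, labels, and the macro-F1 scoring choice are placeholders rather than the study's exact setup.

```python
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# X: selected student features (habits, academic history, ...); y: performance class.
models = {
    "SVM (linear kernel)": SVC(kernel="linear"),
    "LDA": LinearDiscriminantAnalysis(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="f1_macro")
    print(f"{name}: mean F1 = {scores.mean():.3f}")
```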

Keywords: Student performance, supervised machine learning, prediction, classification, cross-validation.
