Search results for: Cluster Computing
131 Automatic Map Simplification for Visualization on Mobile Devices
Authors: Hang Yu
Abstract:
The visualization of geographic information on mobile devices has become popular with the widespread use of the mobile Internet. The mobility of these devices brings much convenience to people's lives. Through the add-on location-based services of the devices, people can access timely information relevant to their tasks. However, visual analysis of geographic data on mobile devices presents several challenges due to the small display and restricted computing resources. These limitations on screen size and resources may impair the usability of visualization applications. In this paper, a variable-scale visualization method is proposed to handle the challenge of the small mobile display. By merging multiple scales of information into a single image, the viewer is able to focus on the region of interest while maintaining a good grasp of the surrounding context. This is essentially visualizing the map through a fisheye lens. However, the fisheye lens induces undesirable geometric distortion in the periphery, which renders the information there meaningless. The proposed solution is to apply map generalization, which removes excessive information around the periphery, and an automatic smoothing process that corrects the distortion while keeping the local topology consistent. The proposed method is applied to both artificial and real geographical data for evaluation.
Keywords: Map simplification, visualization, mobile devices.
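As a rough illustration of the variable-scale idea, the short Python sketch below applies a classic radial fisheye magnification to 2D map coordinates; the focus point, the distortion strength d, and the coordinate normalization are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def fisheye(points, focus, d=4.0):
    """Apply a simple radial fisheye transform to 2D points.

    points: (N, 2) array of map coordinates, normalized to [-1, 1] around the focus.
    focus:  (2,) array, the centre of attention on the map.
    d:      distortion strength; larger values magnify the focus region more.
    """
    v = points - focus                              # vectors from the focus to each point
    r = np.linalg.norm(v, axis=1, keepdims=True)    # radial distance of each point
    r = np.clip(r, 1e-9, 1.0)                       # avoid division by zero, keep radius in [0, 1]
    r_new = (d + 1) * r / (d * r + 1)               # classic fisheye magnification curve
    return focus + v / r * r_new                    # move each point to its distorted radius

# Example: points near the focus are spread out, peripheral points are compressed.
pts = np.array([[0.1, 0.0], [0.5, 0.0], [0.9, 0.0]])
print(fisheye(pts, focus=np.array([0.0, 0.0])))
```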
130 Sequence Relationships Similarity of Swine Influenza A (H1N1) Virus
Authors: Patsaraporn Somboonsak, Mud-Armeen Munlin
Abstract:
In April 2009, a new variant of Influenza A virus subtype H1N1 emerged in Mexico and spread all over the world. Influenza A has three subtypes circulating in humans (H1N1, H1N2 and H3N2), while types B and C influenza tend to be associated with local or regional epidemics. Preliminary genetic characterization of the influenza viruses identified them as swine influenza A (H1N1) viruses. Nucleotide sequence analysis showed that the haemagglutinin (HA) genes are similar to each other and to the majority of the corresponding genes of swine influenza viruses, while the genes coding for the neuraminidase (NA) and matrix (M) proteins are similar to the corresponding genes of swine influenza. Sequence similarity between the 2009 A (H1N1) virus and its nearest relatives indicates that its gene segments have been circulating undetected for an extended period. Nucleic acid sequence analysis using Maximum Composite Likelihood (MCL) and DNA empirical base frequencies reveals the phylogenetic relationship amongst the HA genes of H1N1 viruses deposited in GenBank, which show high nucleotide sequence homology. In this paper we used 16 HA nucleotide sequences from NCBI to compute the sequence relationship similarity of the swine influenza A virus. Using the MCL method, the result is 28%; 36.64% for the optimal tree with the sum of branch lengths; 35.62% for the interior-branch phylogeny of the Neighbor-Joining tree; 1.85% for the overall transition/transversion ratio; and 8.28% for the overall mean distance.
Keywords: DNA sequence, relationship of swine, swine influenza, sequence similarity.
129 Searchable Encryption in Cloud Storage
Authors: Ren-Junn Hwang, Chung-Chien Lu, Jain-Shing Wu
Abstract:
Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, retrieving the target file exactly from the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest-neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords, and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce file transfer communication costs, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs an incorrectly spelled keyword and the number of wrong letters does not exceed a given threshold, the user can still retrieve the target files from the cloud server. In addition, the proposed scheme satisfies the security requirements for outsourced data storage.
Keywords: Fault-tolerance search, multi-keywords search, outsource storage, ranked search, searchable encryption.
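The protocol itself works over encrypted index vectors, so the following Python sketch is only a plaintext analogue of the ranked-retrieval step: it scores files against the query keywords with a simple term-frequency measure (an assumption standing in for the paper's secure k-nearest-neighbor relevance computation) and returns the files in order of relevance.

```python
from collections import Counter

def rank_files(files, query_keywords, top_k=3):
    """Rank files by a simple term-frequency relevance score (plaintext analogue only).

    files: dict mapping file name -> list of tokens.
    query_keywords: list of search keywords.
    Returns the top_k file names in descending order of relevance.
    """
    scores = {}
    for name, tokens in files.items():
        tf = Counter(tokens)
        scores[name] = sum(tf[kw] for kw in query_keywords)   # relevance score of this file
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_k]

files = {
    "a.txt": ["cloud", "storage", "encryption", "cloud"],
    "b.txt": ["grid", "scheduling"],
    "c.txt": ["cloud", "search", "ranked", "encryption"],
}
print(rank_files(files, ["cloud", "encryption"]))   # most relevant files first
```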
128 Intelligent Mobile Search Oriented to Global e-Commerce
Authors: Abdelkader Dekdouk
Abstract:
In this paper we propose a novel approach for searching eCommerce products using a mobile phone, illustrated by a prototype, eCoMobile. This approach aims to globalize mobile search by integrating the concept of user multilingualism into it. To demonstrate this, we deal particularly with the English and Arabic languages. The mobile user can formulate a query about a commercial product in either language (English/Arabic). The description of the information need relies on an ontology that represents the conceptualization of the product catalogue knowledge domain, defined in both English and Arabic. A query expressed on the mobile device client specifies the concept that corresponds to the name of the product, followed by a set of pairs (property, value) describing the characteristics of the product. Once a query is submitted, it is communicated to the server side, which analyses it and in turn performs an HTTP request to an eCommerce application server (such as Amazon). The latter responds by returning an XML file representing a set of elements, where each element defines an item of the searched product with its specific characteristics. The XML file is analyzed on the server side, and the items are then displayed on the mobile device client along with their relevant characteristics in the chosen language.
Keywords: Mobile computing, search engine, multilingual global eCommerce, ontology, XML.
127 Web Page Watermarking: XML files using Synonyms and Acronyms
Authors: Nighat Mir, Sayed Afaq Hussain
Abstract:
Recent enhancements in the field of computing have massively increased the use of web-based electronic documents. Current copyright protection laws are inadequate to prove ownership of electronic documents and do not provide strong protection against copying and manipulating information from the web. This has opened many channels for securing information, and significant advances have been made in the area of information security. Digital watermarking has developed into a very dynamic area of research and has addressed challenging issues for digital content. Watermarking can be visible (logos or signatures) or invisible (encoding and decoding). Many visible watermarking techniques have been studied for text documents, but there are very few for web-based text. XML files are used to exchange information on the Internet and contain important information. In this paper, two invisible watermarking techniques using synonyms and acronyms are proposed for XML files to prove intellectual ownership and to achieve security. An analysis is made for different attacks, and the capacity that can be embedded in the XML file is also measured. A comparative capacity analysis is made for both methods. The system has been implemented in the C# language, and all tests were performed practically to obtain the results.
Keywords: Watermarking, Extensible Markup Language (XML), synonyms, acronyms, copyright protection.
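The authors' system is implemented in C#; as a language-neutral illustration of the synonym-based idea, the Python sketch below embeds watermark bits by choosing between words of a hypothetical synonym table inside XML text nodes. The synonym pairs and the bit convention are assumptions for illustration only, not the paper's actual tables.

```python
import xml.etree.ElementTree as ET

# Hypothetical synonym pairs: choosing the first word encodes bit 0, the second bit 1.
SYNONYMS = [("big", "large"), ("buy", "purchase"), ("quick", "fast")]

def embed_bits(xml_text, bits):
    """Embed watermark bits by substituting synonyms inside XML text nodes."""
    root = ET.fromstring(xml_text)
    bit_iter = iter(bits)
    for elem in root.iter():
        if not elem.text:
            continue
        words = elem.text.split()
        for i, w in enumerate(words):
            for a, b in SYNONYMS:
                if w.lower() in (a, b):
                    try:
                        bit = next(bit_iter)
                    except StopIteration:            # all watermark bits embedded
                        elem.text = " ".join(words)
                        return ET.tostring(root, encoding="unicode")
                    words[i] = a if bit == 0 else b  # pick the synonym that encodes the bit
                    break
        elem.text = " ".join(words)
    return ET.tostring(root, encoding="unicode")

doc = "<item><desc>big screen quick delivery</desc></item>"
print(embed_bits(doc, [1, 0]))   # -> "large screen quick delivery" (encodes bits 1, 0)
```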
126 An Images Monitoring System based on Multi-Format Streaming Grid Architecture
Authors: Yi-Haur Shiau, Sun-In Lin, Shi-Wei Lo, Hsiu-Mei Chou, Yi-Hsuan Chen
Abstract:
This paper proposes a novel multi-format streaming grid architecture for a real-time image monitoring system. The system, based on a three-tier architecture, includes a stream receiving unit, a stream processor unit, and a presentation unit. It is a distributed computing and loosely coupled architecture. The benefit is that the number of required servers can be adjusted depending on the load of the image monitoring system. The stream receiving unit supports multiple capture source devices and multi-format stream compression encoders. The stream processor unit includes three modules: a stream clipping module, an image processing module, and an image management module. The presentation unit can display image data on several different platforms. We verified the proposed grid architecture with an actual image monitoring test. We used a fast image matching method with adjustable parameters for different monitoring situations. A background subtraction method is also implemented in the system. Experimental results showed that the proposed architecture is robust, adaptive, and powerful for image monitoring.
Keywords: Motion detection, grid architecture, image monitoring system, background subtraction.
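The abstract mentions a background subtraction method; a minimal sketch of running-average background subtraction is given below in Python, with the update rate and difference threshold chosen purely for illustration.

```python
import numpy as np

def detect_motion(frames, alpha=0.05, threshold=25):
    """Simple running-average background subtraction over grayscale frames.

    frames: iterable of 2D uint8 arrays (grayscale images).
    alpha: background update rate; threshold: per-pixel difference threshold.
    Yields a boolean foreground mask for each frame after the first.
    """
    frames = iter(frames)
    background = next(frames).astype(np.float32)
    for frame in frames:
        frame = frame.astype(np.float32)
        diff = np.abs(frame - background)                        # difference from the model
        mask = diff > threshold                                  # foreground = large differences
        background = (1 - alpha) * background + alpha * frame    # slowly adapt the model
        yield mask

# Example with synthetic frames: a bright square "appears" in the second frame.
f0 = np.zeros((4, 4), dtype=np.uint8)
f1 = f0.copy(); f1[1:3, 1:3] = 200
for m in detect_motion([f0, f1]):
    print(m.astype(int))
```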
125 Design of Parity-Preserving Reversible Logic Signed Array Multipliers
Authors: Mojtaba Valinataj
Abstract:
Reversible logic, as a new and favorable design domain, can be used in various fields, especially for creating quantum computers, because of its speed and negligible power consumption. However, its susceptibility to a variety of environmental effects may lead to incorrect results. In this paper, because of the importance of the multiplication operation in various computing systems, some novel reversible logic array multipliers are proposed with error detection capability by incorporating parity-preserving gates. The new designs are presented for the two main parts of array multipliers, partial product generation and multi-operand addition, by exploiting new arrangements of existing gates, which results in two signed parity-preserving array multipliers. The experimental results reveal that the best proposed 4×4 multiplier in this paper achieves 12%, 24%, and 26% improvements in the number of constant inputs, number of required gates, and quantum cost, respectively, compared to the previous designs. Moreover, the best proposed design is generalized for n×n multipliers with general formulations to estimate the main reversible logic criteria as functions of the multiplier size.
Keywords: Array multipliers, Baugh-Wooley method, error detection, parity-preserving gates, quantum computers, reversible logic.
124 Educational Data Mining: The Case of Department of Mathematics and Computing in the Period 2009-2018
Authors: M. Sitoe, O. Zacarias
Abstract:
University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to the improvement of the students' own academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency toward evasion or retention. To this end, a database of real students' data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used. The data comprised 388 undergraduate students admitted in the years 2009 to 2014. The Weka tool was used for model building, using three different techniques, namely: K-nearest neighbor, random forest, and logistic regression. To allow for training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods of Bagging and Stacking were used. After comparing the results obtained by the three classifiers, logistic regression using Bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
Keywords: Evasion and retention, cross validation, bagging, stacking.
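The study itself was carried out in Weka; the scikit-learn sketch below only mirrors the reported setup of bagged logistic regression evaluated with 7-fold cross-validation, using placeholder data standing in for the 388 student records.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data standing in for the 388 student records (grades, admission data, ...).
X, y = make_classification(n_samples=388, n_features=10, random_state=0)

# Bagged logistic regression, evaluated with 7-fold cross-validation as in the abstract.
# (Use base_estimator= instead of estimator= on scikit-learn versions before 1.2.)
model = BaggingClassifier(estimator=LogisticRegression(max_iter=1000),
                          n_estimators=10, random_state=0)
scores = cross_val_score(model, X, y, cv=7, scoring="accuracy")
print("mean accuracy: %.3f" % scores.mean())
```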
123 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures
Authors: Adriano Z. Zambom, Preethi Ravikumar
Abstract:
One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effect of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work, the efficiency of completely nonparametric regression estimators, such as Loess, is compared to estimators that assume additivity, in several situations including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean square error of the estimators with respect to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike Information Criterion is proposed, which is computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of times it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed due to the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure and the selected variables are identified.
Keywords: Additive models, local polynomial regression, residuals, mean square error, variable selection.
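A minimal sketch of the backward elimination idea is shown below, using ordinary least squares and its AIC (via statsmodels) as a stand-in for the paper's additive and nonparametric fits; the data and variable names are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_elimination(X, y, names):
    """Drop, one at a time, the variable whose removal most improves (lowers) the AIC."""
    selected = list(names)
    while len(selected) > 1:
        current_aic = sm.OLS(y, sm.add_constant(X[selected])).fit().aic
        # AIC of each model with one candidate variable removed
        trial_aics = {v: sm.OLS(y, sm.add_constant(X[[c for c in selected if c != v]])).fit().aic
                      for v in selected}
        best_drop = min(trial_aics, key=trial_aics.get)
        if trial_aics[best_drop] >= current_aic:   # no removal improves the AIC: stop
            break
        selected.remove(best_drop)
    return selected

# Placeholder data: only x1 and x2 actually drive the response.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 4)), columns=["x1", "x2", "x3", "x4"])
y = 2 * X["x1"] - X["x2"] + rng.normal(scale=0.5, size=200)
print(backward_elimination(X, y, X.columns))
```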
122 A Holistic Framework for Unifying Data Security and Management in Modern Enterprises
Authors: Ashly Joseph
Abstract:
Modern businesses struggle significantly to secure and manage their data properly as the volume and complexity of their data both expand exponentially. Through the use of a multi-layered defense strategy, a centralized management platform, and cutting-edge technologies like AI, this research paper presents a comprehensive framework to integrate data security and management. The constraints of current data protection and management strategies, technological advancements, and the evolving threat landscape are all examined in this article. It suggests best practices for putting into practice integrated data security and governance models, placing an emphasis on ongoing adaptation. The advantages mentioned include a strengthened security posture, simpler procedures, lower costs, and reduced complexity. Additionally, issues including skill shortages, antiquated systems, and cultural obstacles are examined. Security executives and Chief Information Security Officers are given practical advice on how to evaluate, plan, and put into place strong data-centric security and management capabilities. The goal of the paper is to provide a thorough study of the data security and management landscape and to arm contemporary businesses with the knowledge they need to be proactive in protecting their data assets.
Keywords: Data security, security management, cloud computing, cybersecurity, data governance, security architecture, data management.
121 Orchestra/Percussion Classification Algorithm for Unified Speech Audio Coding System
Authors: Yueming Wang, Rendong Ying, Sumxin Jiang, Peilin Liu
Abstract:
Unified Speech and Audio Coding (USAC), the latest MPEG standard for unified speech and audio coding, uses a speech/audio classification algorithm to distinguish the speech and audio segments of the input signal. The quality of the recovered audio can be increased by a well-designed orchestra/percussion classification and subsequent processing. Owing to shortcomings of the existing system, introducing an orchestra/percussion classification stage and modifying the subsequent processing can greatly increase the quality of the recovered audio. This paper proposes an orchestra/percussion classification algorithm for the USAC system which extracts only 3 Mel-Frequency Cepstral Coefficients (MFCCs) rather than the traditional 13, and uses an Iterative Dichotomiser 3 (ID3) decision tree rather than other, more complex learning methods; thus, the proposed algorithm has lower computational complexity than most existing algorithms. Considering that frequent changes of attributes may lead to quality loss in the recovered audio signal, this paper also designs a modified subsequent process that helps the whole classification system reach an accuracy as high as 97%, which is comparable to the classical 99%.
Keywords: ID3 Decision Tree, MFCC, Orchestra/Percussion Classification, USAC
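As an illustrative sketch only, the Python code below extracts 3 MFCCs per clip (using librosa, an assumed library not mentioned in the paper) and trains an entropy-criterion decision tree as an ID3-style classifier on two synthetic clips standing in for orchestra and percussion segments.

```python
import numpy as np
import librosa
from sklearn.tree import DecisionTreeClassifier

def mfcc_features(y, sr, n_mfcc=3):
    """Extract n_mfcc MFCCs per frame and average them over time (one vector per clip)."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # shape: (n_mfcc, frames)
    return mfcc.mean(axis=1)

# Placeholder clips: two synthetic signals standing in for orchestra/percussion segments.
sr = 22050
t = np.linspace(0, 1, sr, endpoint=False)
orchestra_like = np.sin(2 * np.pi * 440 * t)                                  # sustained tonal content
percussion_like = np.random.default_rng(0).normal(size=sr) * np.exp(-5 * t)   # decaying noise burst

X = np.array([mfcc_features(orchestra_like, sr), mfcc_features(percussion_like, sr)])
labels = np.array([0, 1])   # 0 = orchestra, 1 = percussion

# Entropy-based tree as an ID3-style classifier (information-gain splitting criterion).
clf = DecisionTreeClassifier(criterion="entropy").fit(X, labels)
print(clf.predict(X))
```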
120 An Automatic Pipeline Monitoring System Based on PCA and SVM
Abstract:
This paper proposes a novel system for monitoring the health of underground pipelines. Some of these pipelines transport dangerous contents, and any damage incurred might have catastrophic consequences. However, most of this damage is unintentional, usually the result of surrounding construction activities. In order to prevent such potential damage, monitoring systems are indispensable. This paper focuses on acoustically recognizing road cutters, since they precede most construction activities in modern cities. Acoustic recognition can be achieved by installing a distributed computing sensor network along the pipelines and using smart sensors to "listen" for potential threats and, if there is a real threat, raise some form of alarm. For efficient pipeline monitoring, a novel monitoring approach is proposed. Principal Component Analysis (PCA) was studied and applied. Eigenvalues were regarded as the special signature that characterizes a sound sample and were thus used as the feature vector for sound recognition. The denoising ability of PCA makes it robust to noise interference. A one-class SVM was used as the classifier. On-site experimental results show that the proposed PCA- and SVM-based acoustic recognition system is very effective, with a low tendency to raise false alarms.
Keywords: One-class SVM, pipeline monitoring system, principal component analysis, sound recognition, third party damage.
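A minimal sketch of the described pipeline, assuming PCA eigenvalues of blocks of audio frames as the feature vector and a one-class SVM trained on road-cutter samples, is shown below; the frame layout, signal content, and SVM parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import OneClassSVM

def eigen_signature(frames, n_components=5):
    """Use the leading PCA eigenvalues of a block of audio frames as the feature vector."""
    pca = PCA(n_components=n_components)
    pca.fit(frames)                    # frames: (n_frames, frame_length)
    return pca.explained_variance_     # eigenvalues of the frame covariance matrix

rng = np.random.default_rng(0)
# Placeholder training samples standing in for recorded road-cutter sound blocks.
cutter_samples = [eigen_signature(rng.normal(size=(40, 256)) + np.sin(np.arange(256) * 0.3))
                  for _ in range(20)]

detector = OneClassSVM(nu=0.1, kernel="rbf", gamma="scale").fit(cutter_samples)

# A new sound block is flagged as a threat if the one-class SVM predicts +1 (inlier,
# i.e. it resembles the cutter signatures the detector was trained on).
test_block = eigen_signature(rng.normal(size=(40, 256)) + np.sin(np.arange(256) * 0.3))
print("threat detected" if detector.predict([test_block])[0] == 1 else "no threat")
```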
119 Automotive Emotions: An Investigation of Their Natures, Frequencies of Occurrence and Causes
Authors: Marlene Weber, Joseph Giacomin, Alessio Malizia, Lee Skrypchuk, Voula Gkatzidou
Abstract:
Technological and sociological developments in the automotive sector are shifting the focus of design towards developing a better understanding of driver needs, desires and emotions. Human-centred design methods are being applied more frequently to automotive research, including the use of systems to detect human emotions in real time. One method for non-contact measurement of emotion with low intrusiveness is Facial Expression Analysis (FEA). This paper describes a research study investigating the emotional responses of 22 participants in a naturalistic driving environment by applying a multi-method approach. The research explored the possibility of investigating emotional responses and their frequencies during naturalistic driving through real-time FEA. Observational analysis was conducted to assign causes to the collected emotional responses. In total, 730 emotional responses were measured during the collective study time of 440 minutes. Causes were assigned to 92% of the measured emotional responses. This research establishes and validates a methodology for the study of emotions and their causes in the driving environment, through which systems and factors causing positive and negative emotional effects can be identified.
Keywords: Affective computing, case study, emotion recognition, human computer interaction.
118 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500
Authors: Mustafa Elfituri, Jonathan Cook
Abstract:
Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties, such as irregularity and poor locality, that make their performance different from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared-memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to the factors that contribute to performance degradation.
Keywords: Graph computation, Graph500 benchmark, parallel architectures, parallel programming, workload characterization.
117 An Analysis of Uncoupled Designs in Chicken Egg
Authors: Pratap Sriram Sundar, Chandan Chowdhury, Sagar Kamarthi
Abstract:
Nature has perfected her designs over 3.5 billion years of evolution. Research fields such as biomimicry, biomimetics, bionics, bio-inspired computing, and nature-inspired design have explored nature-made artifacts and systems to understand nature's mechanisms and intelligence. Learning from nature, researchers have generated sustainable designs and innovation in a variety of fields such as energy, architecture, agriculture, transportation, communication, and medicine. Axiomatic design offers a method to judge whether a design is good. This paper analyzes the design aspects of one of nature's amazing objects: the chicken egg. The functional requirements (FRs) of the components of the object are tabulated and mapped onto nature-chosen design parameters (DPs). The ‘independence axiom’ of the axiomatic design methodology is applied to analyze couplings and to evaluate whether the egg's design is good (i.e., uncoupled) or bad (i.e., coupled). The analysis revealed that the egg's design is a good, i.e., uncoupled, design. This approach can be applied to any of nature's artifacts to judge whether their design is good or bad. The methodology is valuable for biomimicry studies and can also be very useful for teaching design in biology and bio-inspired innovation.
Keywords: Uncoupled design, axiomatic design, nature design, design evaluation.
116 Consumer Perception of 3D Body Scanning While Online Shopping for Clothing
Authors: A. Grilec, S. Petrak, M. Mahnic Naglic
Abstract:
Technological development and globalization in the production and sale of clothing over the last decade have significantly changed the consumer relationship with industrially fashioned apparel and the way clothing is purchased. Internet sales of clothing are increasing constantly and significantly in the global market, but the possibilities offered by modern computing technologies in the customization segment are not yet fully exploited, especially with respect to individual customer requirements and body sizes. Considering the growing trend of online shopping, the main goal of this paper is to investigate the differences in customer perceptions of online apparel shopping and, in particular, to discover the main differences in perceptions between customers of three different body sizes. In order to achieve this goal, a quantitative study of a sample of 85 Croatian consumers was conducted in 2017 in Zagreb, Croatia. Respondents were asked to indicate their level of agreement on a five-point Likert scale ranging from strongly disagree (1) to strongly agree (5). To analyze the attitudes of respondents, simple and descriptive statistics were used. The main findings highlight the differences in respondents' perception of 3D body scanning, their willingness to use 3D body scanning in Internet shopping, and their online apparel shopping habits with respect to their body sizes.
Keywords: Consumer behavior, online shopping, 3D body scanning.
115 A Complexity-Based Approach in Image Compression using Neural Networks
Authors: Hadi Veisi, Mansour Jamzad
Abstract:
In this paper we present an adaptive method for image compression that is based on the complexity level of the image. The basic compressor/de-compressor structure of this method is a multilayer perceptron artificial neural network. In the adaptive approach, different back-propagation artificial neural networks are used as compressor and de-compressor; this is done by dividing the image into blocks, computing the complexity of each block, and then selecting one network for each block according to its complexity value. Three complexity measures, called Entropy, Activity and Pattern-based, are used to determine the level of complexity of image blocks, and their ability to estimate complexity is evaluated and compared. In training and evaluation, each image block is assigned to a network based on its complexity value. Best-SNR is an alternative way of selecting the compressor network for image blocks in the evaluation phase, which chooses the trained network that yields the best SNR when compressing the input image block. In our evaluations, the best results are obtained when overlapping of the blocks is allowed and the compressor networks are chosen based on Best-SNR. In this case, the results demonstrate the superiority of this method compared with previous similar works and JPEG standard coding.
Keywords: Adaptive image compression, image complexity, multi-layer perceptron neural network, JPEG standard, PSNR.
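A small sketch of the entropy-based complexity measure is given below: each image block's Shannon entropy is computed and used to route the block to one of several compressor networks. The block size and entropy thresholds are illustrative assumptions, and the neural networks themselves are not shown.

```python
import numpy as np

def block_entropy(block):
    """Shannon entropy of an 8-bit image block, used as its complexity measure."""
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def assign_networks(image, block=8, thresholds=(2.0, 5.0)):
    """Split the image into blocks and pick a compressor-network index per block.

    thresholds are illustrative entropy cut-offs: low/medium/high complexity -> network 0/1/2.
    """
    h, w = image.shape
    assignment = np.zeros((h // block, w // block), dtype=int)
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            e = block_entropy(image[i:i + block, j:j + block])
            assignment[i // block, j // block] = int(np.searchsorted(thresholds, e))
    return assignment

img = np.random.default_rng(0).integers(0, 256, size=(32, 32))
print(assign_networks(img))   # network index chosen for each 8x8 block
```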
114 RadMote: A Mobile Framework for Radiation Monitoring in Nuclear Power Plants
Authors: Javier Barbaran, Manuel Díaz, Inaki Esteve, Bartolome Rubio
Abstract:
Wireless Sensor Networks (WSNs) have attracted the attention of many researchers. This has resulted in their rapid integration in very different areas such as precision agriculture, environmental monitoring, object and event detection, and military surveillance. Due to current WSN characteristics, this technology is especially useful in industrial areas where security, reliability and autonomy are essential, such as nuclear power plants, chemical plants, and others. In this paper we present a system based on WSNs to monitor environmental conditions around and inside a nuclear power plant, specifically radiation levels. Sensor nodes, equipped with radiation sensors, are deployed in fixed positions throughout the plant. In addition, plant staff are also equipped with mobile devices with higher capabilities than the sensors, for example PDAs, able to monitor radiation levels and other conditions around them. The system enables communication between PDAs, which form a Mobile Ad-hoc Wireless Network (MANET), and allows workers to monitor remote conditions in the plant. It is particularly useful during stoppage periods for inspection, or in the event of an accident, to prevent risk situations.
Keywords: MANETs, mobile computing, radiation monitoring, wireless sensor networks.
113 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition
Authors: A. Bayaga
Abstract:
This research provides a technical account of estimating transition probabilities using a time-homogeneous Markov jump process, applied to South African HIV/AIDS data from Statistics South Africa. It employs a Maximum Likelihood Estimation (MLE) model to explore the possible influence of transition probabilities on mortality cases, with the data based on actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic. The model was fitted to age-specific HIV prevalence data and recorded death data using the MLE model. Although previous model results suggest that HIV prevalence in South Africa has declined and AIDS mortality rates have declined over 2002-2013, our results differ markedly from the generally accepted HIV models (Spectrum/EPP and ASSA2008) for South Africa. Supplementary research is therefore needed to enhance the demographic parameters in the model and to apply it to each of the nine provinces of South Africa.
Keywords: AIDS mortality rates, Epidemiological model, Time-homogeneous Markov Jump Process, Transition Probability, Statistics South Africa.
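For a time-homogeneous Markov jump process, the maximum likelihood estimate of each off-diagonal transition intensity is the number of observed i to j jumps divided by the total time spent in state i. The sketch below implements this on hypothetical state trajectories; it does not use the actual Statistics South Africa data.

```python
import numpy as np

def estimate_intensity(paths, n_states):
    """MLE of the transition intensity matrix Q for a time-homogeneous Markov jump process.

    paths: list of trajectories, each a list of (state, time_of_entry) records, with the
           final record marking the exit time. q_ij = N_ij / T_i, where N_ij counts
           observed i -> j jumps and T_i is the total time spent in state i.
    """
    N = np.zeros((n_states, n_states))   # jump counts
    T = np.zeros(n_states)               # holding times
    for path in paths:
        for (s, t0), (s_next, t1) in zip(path[:-1], path[1:]):
            T[s] += t1 - t0
            if s_next != s:
                N[s, s_next] += 1
    Q = np.divide(N, T[:, None], out=np.zeros_like(N), where=T[:, None] > 0)
    np.fill_diagonal(Q, -Q.sum(axis=1))  # rows of an intensity matrix sum to zero
    return Q

# Hypothetical 3-state example (e.g. susceptible -> infected -> deceased transitions).
paths = [[(0, 0.0), (1, 2.5), (2, 6.0)], [(0, 0.0), (0, 4.0)]]
print(estimate_intensity(paths, 3))
```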
112 Real-time 3D Feature Extraction without Explicit 3D Object Reconstruction
Authors: Kwangjin Hong, Chulhan Lee, Keechul Jung, Kyoungsu Oh
Abstract:
For communication between humans and computers in an interactive computing environment, gesture recognition is studied vigorously. Many studies have therefore proposed efficient recognition algorithms using images captured by 2D cameras. However, these methods have a limitation: the extracted features cannot fully represent the object in the real world. Although many studies have used 3D features instead of 2D features for more accurate gesture recognition, problems such as the processing time needed to generate 3D objects remain unsolved in related research. We therefore propose a method to extract 3D features combined with 3D object reconstruction. This method uses a modified GPU-based visual hull generation algorithm, which disables unnecessary processes such as texture calculation, to generate three kinds of 3D projection maps as the 3D features: the nearest boundary, the farthest boundary, and the thickness of the object projected onto the base plane. In the experimental results section, we present the results of the proposed method on eight human postures: T shape, both hands up, right hand up, left hand up, hands front, stand, sit and bend, and compare the computation time of the proposed method with that of previous methods.
Keywords: Fast 3D feature extraction, gesture recognition, computer vision.
111 Automatic Verification Technology of Virtual Machine Software Patch on IaaS Cloud
Authors: Yoji Yamato
Abstract:
In this paper, we propose an automatic verification technology for software patches applied to user virtual environments on IaaS Cloud, to decrease the verification costs of patches. Nowadays, IaaS services have spread, and many users can customize virtual machines on IaaS Cloud like their own private servers. Regarding software patches for the OS or middleware installed on virtual machines, users need to apply and verify these patches by themselves, which increases their operation costs. Our proposed method replicates user virtual environments, extracts verification test cases for the user virtual environments from a test case DB, distributes patches to the virtual machines in the replicated environments, and conducts those test cases automatically in the replicated environments. We have implemented the proposed method on OpenStack using Jenkins and confirmed its feasibility. Using the implementation, we confirmed the effectiveness of the test case creation effort achieved by our proposed idea of 2-tier abstraction of software functions and test cases. We also evaluated the automatic verification performance of environment replication, test case extraction and test case execution.
Keywords: OpenStack, Cloud Computing, Automatic verification, Jenkins.
110 Flow Modeling and Runner Design Optimization in Turgo Water Turbines
Authors: John S. Anagnostopoulos, Dimitrios E. Papantonis
Abstract:
The incorporation of computational fluid dynamics in the design of modern hydraulic turbines appears to be necessary in order to improve their efficiency and cost-effectiveness beyond the traditional design practices. A numerical optimization methodology is developed and applied in the present work to a Turgo water turbine. The fluid is simulated by a Lagrangian mesh-free approach that can provide detailed information on the energy transfer and enhance the understanding of the complex, unsteady flow field, at very small computing cost. The runner blades are initially shaped according to hydrodynamics theory, and parameterized using Bezier polynomials and interpolation techniques. The use of a limited number of free design variables allows for various modifications of the standard blade shape, while stochastic optimization using evolutionary algorithms is implemented to find the best blade that maximizes the attainable hydraulic efficiency of the runner. The obtained optimal runner design achieves considerably higher efficiency than the standard one, and its numerically predicted performance is comparable to a real Turgo turbine, verifying the reliability and the prospects of the new methodology.
Keywords: Turgo turbine, Lagrangian flow modeling, surface parameterization, design optimization, evolutionary algorithms.
109 Transcriptional Evidence for the Involvement of MyD88 in Flagellin Recognition: Genomic Identification of Rock Bream MyD88 and Comparative Analysis
Authors: N. Umasuthan, S. D. N. K. Bathige, W. S. Thulasitha, I. Whang, J. Lee
Abstract:
MyD88 is an evolutionarily conserved, host-expressed adaptor protein that is essential for proper TLR/IL1R immune-response signaling. A previously identified complete cDNA (1626 bp) of OfMyD88 comprised an ORF of 867 bp encoding a protein of 288 amino acids (32.9 kDa). The gDNA (3761 bp) of OfMyD88 revealed a quinquepartite genome organization composed of 5 exons (with sizes of 310, 132, 178, 92 and 155 bp) separated by 4 introns. All the introns displayed splice signals consistent with the consensus GT/AG rule. A bipartite domain structure with two domains, namely a death domain (24-103) encoded by the 1st exon and a TIR domain (151-288) encoded by the last 3 exons, was identified through in silico analysis. Moreover, homology modeling of these two domains revealed a similar quaternary folding nature between the human and rock bream homologs. A comprehensive comparison of vertebrate MyD88 genes showed that they possess a 5-exonic structure. In this structure, the last three exons are strongly conserved, suggesting that a rigid structure has been maintained during vertebrate evolution. A cluster of TATA box-like sequences was found 0.25 kb upstream of the cDNA start position. In addition, the putative 5'-flanking region of OfMyD88 was predicted to contain TFBSs implicated in TLR signaling, including copies of NFkB1, APRF/STAT3, Sp1, IRF1 and 2, and Stat1/2. Using the qPCR technique, ubiquitous mRNA expression was detected in liver and blood. Furthermore, significantly up-regulated transcriptional expression of OfMyD88 was detected in head kidney (12-24 h; >2-fold), spleen (6 h; 1.5-fold), liver (3 h; 1.9-fold) and intestine (24 h; ~2-fold) post-Fla challenge. These data suggest a crucial role for MyD88 in the antibacterial immunity of teleosts.
Keywords: MyD88, Innate immunity, Flagellin, Genomic analysis.
108 Nonlinear Effects in Stiffness Modeling of Robotic Manipulators
Authors: A. Pashkevich, A. Klimchik, D. Chablat
Abstract:
The paper focuses on enhanced stiffness modeling of robotic manipulators by taking into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example that deals with the stiffness analysis of a parallel manipulator of the Orthoglide family.
Keywords: Robotic manipulators, stiffness model, loaded mode, nonlinear effects, buckling, Orthoglide manipulator.
107 Harnessing the Power of AI: Transforming DevSecOps for Enhanced Cloud Security
Authors: Ashly Joseph, Jithu Paulose
Abstract:
The increased usage of cloud computing has revolutionized the IT landscape, but it has also raised new security concerns. DevSecOps emerged as a way of tackling these difficulties by integrating security into the software development process. However, the rising complexity and sophistication of cyber threats demand more advanced solutions. This paper looks into the usage of artificial intelligence (AI) techniques in the DevSecOps framework to increase cloud security. This study uses quantitative and qualitative techniques to assess the usefulness of AI approaches such as machine learning, natural language processing, and deep learning in reducing security issues. The paper thoroughly examines the symbiotic relationship between AI and DevSecOps, concentrating on how AI may be seamlessly integrated into the continuous integration and continuous delivery (CI/CD) pipeline, automated security testing, and real-time monitoring methods. The findings emphasize AI's huge potential to improve threat detection, risk assessment, and incident response capabilities. Furthermore, the paper examines the implications and challenges of using AI in DevSecOps workflows, considering factors such as scalability, interpretability, and adaptability. This paper adds to a better understanding of AI's revolutionary role in cloud security and provides valuable insights for practitioners and scholars in the field.
Keywords: Cloud Security, DevSecOps, Artificial Intelligence, AI, Machine Learning, Natural Language Processing, NLP, cybersecurity, AI-driven Security.
106 Designing a Patient Monitoring System Using Cloud and Semantic Web Technologies
Authors: Chryssa Thermolia, Ekaterini S. Bei, Stelios Sotiriadis, Kostas Stravoskoufos, Euripides G.M. Petrakis
Abstract:
Moving into a new era of healthcare, new tools and devices are being developed to extend and improve health services, such as remote patient monitoring and risk prevention. In this context, the Internet of Things (IoT) and Cloud Computing present great advantages by providing remote and efficient services, as well as cooperation between patients, clinicians, researchers and other health professionals. This paper focuses on patients suffering from bipolar disorder, a brain disorder that belongs to a group of conditions called affective disorders and is characterized by great mood swings. We exploit the advantages of Semantic Web and Cloud technologies to develop a patient monitoring system to support clinicians. Based on intelligent filtering of evidence-based knowledge and individual-specific information, we aim to provide treatment notifications and recommended function tests at appropriate times, or to conclude with alerts for serious mood changes and a patient's non-response to treatment. We propose an architecture as the back-end part of a cloud platform for IoT, intertwining intelligent devices with patients' daily routine and clinicians' support.
Keywords: Bipolar disorder, intelligent systems, patient monitoring, semantic web technologies, IoT.
105 Two-Level Identification of HVAC Consumers for Demand Response Potential Estimation Based on Setpoint Change
Authors: M. Naserian, M. Jooshaki, M. Fotuhi-Firuzabad, M. Hossein Mohammadi Sanjani, A. Oraee
Abstract:
In recent years, the development of communication infrastructure and smart meters has facilitated the utilization of demand-side resources, which can enhance the stability and economic efficiency of power systems. Direct load control programs can play an important role in the utilization of demand-side resources in the residential sector. However, the investments required for installing control equipment can be a limiting factor in the development of such demand response programs. Thus, the selection of consumers with higher potentials is crucial to the success of a direct load control program. Heating, ventilation, and air conditioning (HVAC) systems, which due to the heat capacity of buildings feature relatively high flexibility, make up a major part of household consumption. Considering that the consumption of HVAC systems depends highly on the ambient temperature, and bearing in mind the high investments required for control systems enabling direct load control demand response programs, in this paper a solution is presented to uncover consumers with high air conditioner demand among a large number of consumers and to measure the demand response potential of such consumers. This can pave the way for estimating the investments needed for the implementation of direct load control programs for residential HVAC systems and for estimating the demand response potentials in a distribution system. In doing so, we first cluster consumers into several groups based on the correlation coefficients between hourly consumption data and hourly temperature data using the K-means algorithm. Then, by applying a recent algorithm to the hourly consumption and temperature data, consumers with high air conditioner consumption are identified. Finally, the demand response potential of such consumers is estimated based on the equivalent desired temperature setpoint changes.
Keywords: Data-driven analysis, demand response, direct load control, HVAC system.
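A sketch of the first-level clustering step is given below: each consumer is reduced to a single feature, the correlation coefficient between hourly load and hourly temperature, and consumers are grouped with K-means. The synthetic data and the number of clusters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
hours = 24 * 30
temperature = 25 + 8 * np.sin(np.linspace(0, 12 * np.pi, hours))   # synthetic hourly temperature

# Synthetic consumers: half are temperature-driven (HVAC-heavy), half are not.
loads = np.vstack([temperature * 0.05 + rng.normal(0, 0.2, hours) for _ in range(50)] +
                  [rng.normal(1.0, 0.3, (50, hours))])

# First level: one feature per consumer, the load-temperature correlation coefficient.
corr = np.array([np.corrcoef(load, temperature)[0, 1] for load in loads]).reshape(-1, 1)

# Cluster consumers by correlation; the high-correlation cluster is the HVAC-heavy group.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(corr)
for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} consumers, "
          f"mean correlation {corr[labels == k].mean():.2f}")
```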
104 A Budget and Deadline Constrained Fault Tolerant Load Balanced Scheduling Algorithm for Computational Grids
Authors: P. Keerthika, P. Suresh
Abstract:
A grid is an environment with millions of resources which are dynamic and heterogeneous in nature. A computational grid is one in which the resources are computing nodes, and it is meant for applications that involve large computations. A scheduling algorithm is said to be efficient only if it performs good resource allocation even in the case of resource failure. Resource allocation is a tedious issue since it has to consider several requirements such as system load, processing cost and time, user's deadline and resource failure. This work attempts to design a resource allocation algorithm which is cost-effective and also targets load balancing, fault tolerance and user satisfaction by considering the above requirements. The proposed Budget Constrained Load Balancing Fault Tolerant algorithm with user satisfaction (BLBFT) reduces the schedule makespan, schedule cost and task failure rate, and improves resource utilization. Evaluation of the proposed BLBFT algorithm is done using the GridSim toolkit, and the results are compared with algorithms that concentrate separately on these factors. The comparison results show that the proposed algorithm works better than its counterparts.
Keywords: Grid scheduling, load balancing, fault tolerance, makespan, cost, resource utilization.
103 Discrete-time Phase and Delay Locked Loops Analyses in Tracking Mode
Authors: Jiri Sebesta
Abstract:
Phase locked loops (PLLs) and delay locked loops (DLLs) play an important role in establishing coherent references (the phase of the carrier and the symbol timing) in digital communication systems. A fully digital receiver, including a digital carrier synchronizer and a symbol timing synchronizer, fulfils the conditions for a universal multi-mode communication receiver, with the option of setting the symbol rate over several digit places and long-term stability of the required parameters. It is therefore necessary to realize the PLL and DLL of the synchronizer in digital form and to approach these subsystems as discrete representations of their analog templates. This paper performs an analysis of the discrete phase locked loop (DPLL) and the discrete delay locked loop (DDLL) and presents a technique to determine their characteristics based on the analog (continuous-time) template. The transmission response and error function are derived for the 1st-order discrete locked loop, and the resulting equations and graphical representations are given for the 2nd-order one. It is shown that the spectrum translation due to sampling affects the computed frequency characteristics for specific values of the loop parameters.
Keywords: Carrier synchronization, coherent demodulation, software defined receiver, symbol timing.
102 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called the Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluation of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game-theoretic (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: Security, internet of things, cloud computing, Stackelberg security game, machine learning, Naïve Q-learning.
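As a minimal sketch of the reinforcement learning ingredient, the code below runs tabular Q-learning with epsilon-greedy exploration on a toy state/action space; the placeholder environment and payoffs are assumptions and do not represent the SSG/SHARP model itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 4, 3                # toy defender states and defence actions
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.2     # learning rate, discount factor, exploration rate

def step(state, action):
    """Placeholder environment: returns a random next state and a toy defence payoff."""
    next_state = rng.integers(n_states)
    reward = 1.0 if action == state % n_actions else -0.5   # arbitrary payoff structure
    return next_state, reward

state = 0
for _ in range(5000):
    # epsilon-greedy action selection over the current Q estimates
    action = rng.integers(n_actions) if rng.random() < epsilon else int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    # Q-learning update: move Q(s,a) toward the reward plus the discounted best future value
    Q[state, action] += alpha * (reward + gamma * np.max(Q[next_state]) - Q[state, action])
    state = next_state

print(np.round(Q, 2))   # learned action values per state
```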