Search results for: soft computing
757 An Off-the-Shelf Scheme for Dependable Grid Systems Using Virtualization
Authors: Toshinori Takabatake
Abstract:
Grid computing has recently attracted wide attention in science, industry, and business, fields that require a vast amount of computation. Grid computing provides an environment in which many nodes (i.e., many computers) are connected to each other through a local/global network and made available to many users. In this environment, to carry out data processing among nodes for any application, each node performs mutual authentication using certificates issued by a Certificate Authority (CA). However, if a failure or fault occurs in the CA, no new certificates can be issued, and as a result a new node cannot join the grid environment. In this paper, an off-the-shelf scheme for dependable grid systems using virtualization techniques is proposed and its implementation is verified. The proposed approach uses virtualization techniques to restart an application, e.g., the CA, when it fails, so the system can tolerate a failure or fault occurring in the CA. Since the proposed scheme is easily implemented at the application level, its implementation cost for the system builder is low compared with other methods. Simulation results show that the CA in the system can recover from its failure or fault.
Keywords: grid computing, restarting application, certificate authority, virtualization, dependability.
756 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is an outcome of the rapid growth of the Internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation, and task migration. Parameters used to analyze the performance of a load balancing approach include response time, cost, data processing time, and throughput. This paper demonstrates a two-level load balancer that combines the join-idle-queue and join-shortest-queue approaches. The authors used the CloudAnalyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with existing algorithms; as observed, the proposed work is one step ahead of existing techniques.
Keywords: Cloud Analyst, Cloud Computing, Join Idle Queue, Join Shortest Queue, Load balancing, Task Scheduling.
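As an illustration of how the two queueing policies can be combined, the sketch below dispatches tasks through an idle list first (join-idle-queue) and falls back to sampling queue lengths (join-shortest-queue); the class, server count, and sample size are hypothetical and this is not the authors' CloudAnalyst setup.

```python
from collections import deque
import random

class TwoLevelBalancer:
    """Illustrative two-level dispatcher: try join-idle-queue first,
    then fall back to join-shortest-queue among a random sample."""

    def __init__(self, n_servers, sample_size=2):
        self.queues = [deque() for _ in range(n_servers)]
        self.idle = deque(range(n_servers))    # level 1: ids of currently idle servers
        self.sample_size = sample_size

    def dispatch(self, task):
        if self.idle:                          # level 1: join-idle-queue
            sid = self.idle.popleft()
        else:                                  # level 2: join-shortest-queue on a sample
            candidates = random.sample(range(len(self.queues)), self.sample_size)
            sid = min(candidates, key=lambda i: len(self.queues[i]))
        self.queues[sid].append(task)
        return sid

    def complete(self, sid):
        """Called when server `sid` finishes its current task."""
        self.queues[sid].popleft()
        if not self.queues[sid]:
            self.idle.append(sid)

balancer = TwoLevelBalancer(n_servers=4)
for t in range(10):
    print("task", t, "-> server", balancer.dispatch(f"task-{t}"))
```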
755 Secure Hashing Algorithm and Advanced Encryption Algorithm in Cloud Computing
Authors: Jaimin Patel
Abstract:
Cloud computing is one of the most significant developments in computing technology. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multi-tenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, security among them. Because the cloud is a shared server, security is a major concern; it is important to protect users' private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard (AES), their vulnerabilities, risk of attacks, optimal time and complexity management, and a comparison with other algorithms based on software implementation are presented. Encryption techniques to improve the performance of AES and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, and risk of attacks are compared with other hashing algorithms, and the advantages and disadvantages of hashing techniques versus encryption are also discussed.
Keywords: Cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man-in-the-middle attack.
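For readers who want to see the two primitives side by side, here is a minimal sketch pairing SHA-256 hashing with AES-256-GCM encryption; it assumes the third-party Python `cryptography` package and is not the benchmark code evaluated in the paper.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

data = b"confidential document stored in the cloud"

# Integrity: a SHA-256 digest lets the client verify the stored object later.
digest = hashlib.sha256(data).hexdigest()

# Confidentiality: AES-256 in GCM mode gives encryption plus authentication.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)                       # standard 96-bit GCM nonce
ciphertext = AESGCM(key).encrypt(nonce, data, None)

recovered = AESGCM(key).decrypt(nonce, ciphertext, None)
assert recovered == data and hashlib.sha256(recovered).hexdigest() == digest
print("SHA-256:", digest[:16], "...  ciphertext bytes:", len(ciphertext))
```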
754 Cloud Computing for E-Learning with More Emphasis on Security Issues
Authors: Sajjad Hashemi, Seyyed Yasser Hashemi
Abstract:
In today's world, the success of most systems depends on the use of new technologies and information technology (IT), which aim to increase efficiency and user satisfaction. One of the most important systems that uses information technology to deliver services is the education system. For educational services in the form of E-learning systems, however, the hardware and software equipment must be of high quality, which requires substantial investment. Because the vast majority of educational establishments cannot invest in this area, the best way for them to reduce costs while providing E-learning services is to use cloud computing. Given the novelty of cloud technology, however, it can create challenges and concerns, the most noted among them being security issues. Security concerns about cloud-based E-learning products are critical, and security measures are essential to protect users' valuable data from vulnerabilities in these products. The success of such products therefore depends on meeting customers' security requirements so that security threats can be overcome. This paper explores cloud computing and its positive impact on E-learning, with a main focus on identifying the security issues related to cloud-based E-learning, in order to improve security and provide solutions to management challenges.
Keywords: Cloud computing, E-Learning, Security.
753 Heuristic Continuous-time Associative Memories
Authors: Truong Quang Dang Khoa, Masahiro Nakagawa
Abstract:
In this paper, a novel associative memory model is proposed and applied to memory retrieval based on the conventional continuous-time model. In the conventional model, memory capacity is very low and the retrieval process easily converges to an equilibrium state that is very different from the stored patterns. Genetic Algorithms are well known for their capability of global search, escaping local optima on the way to a global optimum. Based on this idea, this work proposes a heuristic rule that applies a mutation when the state of the network is trapped in a spurious memory. The proposed heuristic associative memory shows that the storage capacity does not depend on the number of stored patterns and that the retrieval ability is close to 1.
Keywords: Artificial Intelligence, Soft Computing, Neural Networks, Genetic Algorithms, Hopfield Neural Networks, Associative Memories.
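A minimal discrete-time sketch of the mutation heuristic follows, using a Hebbian Hopfield network rather than the authors' continuous-time model; the pattern set, epoch limit, and number of mutated units are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    """Hebbian weights for bipolar (+1/-1) patterns, zero diagonal."""
    n = patterns.shape[1]
    w = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(w, 0.0)
    return w / n

def retrieve(w, probe, patterns, max_epochs=50, mutation_bits=3):
    s = probe.copy()
    for _ in range(max_epochs):
        for i in rng.permutation(len(s)):            # asynchronous updates
            s[i] = 1 if w[i] @ s >= 0 else -1
        if any(np.array_equal(s, p) for p in patterns):
            return s                                 # reached a stored pattern
        # Spurious attractor: mutate a few random units (GA-style escape).
        idx = rng.choice(len(s), size=mutation_bits, replace=False)
        s[idx] *= -1
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
w = store(patterns)
noisy = patterns[0].copy(); noisy[:2] *= -1          # corrupt two units of pattern 0
print(retrieve(w, noisy, patterns))
```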
752 Development and Evaluation of a Nutraceutical Herbal Summer Drink
Authors: Munish Garg, Vinni Ahuja
Abstract:
In the past few years, the high consumption of soft drinks has attracted negative attention worldwide due to possible adverse effects, leading health-conscious people to seek alternative nutraceutical or herbal health drinks. In the present study, a nutraceutical soft drink was developed using easily available and well-known traditional herbs with nutritional potential. The key ingredients, bael, amla, lemon juice, ashwagandha, and poppy seeds, were selected based on their routine household use in summer and their long-recognized refreshing, cooling, and energizing properties. After several trials, the final composition of the nutraceutical summer soft drink was selected as the most suitable combination from the taste, physicochemical, microbial, and organoleptic points of view. Physicochemical analysis of the prepared drink showed optimum levels of titratable acidity, total soluble solids, and pH, in accordance with commercial recommendations. No bacterial colonies were found in the product, so it was within acceptable limits. In the nine-point hedonic scale sensory evaluation, the drink was strongly liked for colour, taste, flavour, and texture. The formulation was found to contain flavonoids (80 mg/100 ml), phenolics (103 mg/100 ml), and vitamin C (250 mg/100 ml) and has antioxidant potential (75.52%), apart from providing several other essential vitamins, minerals, and healthy components. The developed nutraceutical drink provides an economical and feasible option for consumers, with very good taste combined with potential health benefits, and is potentially capable of replacing the synthetic soft drinks available in the market.
Keywords: Herbal drink, nutraceuticals, summer drink, antioxidant.
751 Artificial Neural Networks and Multi-Class Support Vector Machines for Classifying Magnetic Measurements in Tokamak Reactors
Authors: A. Greco, N. Mammone, F. C. Morabito, M. Versaci
Abstract:
This paper is mainly concerned with the application of a novel technique of data interpretation for classifying measurements of plasma columns in Tokamak reactors for nuclear fusion applications. The proposed method exploits several concepts derived from soft computing theory. In particular, Artificial Neural Networks and Multi-Class Support Vector Machines have been exploited to classify magnetic variables useful to determine shape and position of the plasma with a reduced computational complexity. The proposed technique is used to analyze simulated databases of plasma equilibria based on ITER geometry configuration. As well as demonstrating the successful recovery of scalar equilibrium parameters, we show that the technique can yield practical advantages compared with earlier methods.
Keywords: Tokamak, Classification, Artificial Neural Network, Support Vector Machines.
750 Classification of Attacks over Cloud Environment
Authors: Karim Abouelmehdi, Loubna Dali, Elmoutaoukkil Abdelmajid, Hoda Elsayed Eladnani Fatiha, Benihssane Abderahim
Abstract:
The security of cloud services is a major concern of cloud service providers. In this paper, we present different classifications of cloud attacks proposed by specialized organizations, each agency having its own classification with well-defined properties. The purpose is to present a high-level classification of current research in cloud computing security, organized around attack strategies and the corresponding defenses.
Keywords: Cloud computing, security, classification, risk.
749 Determination of the Quality of the Machined Surface Using Fuzzy Logic
Authors: Dejan Tanikić, Jelena Đoković, Saša Kalinović, Miodrag Manić, Saša Ranđelović
Abstract:
This paper deals with measuring and modelling the quality of the machined surface in a metal machining process. The average surface roughness (Ra), which represents the quality of the machined part, was measured during dry turning of AISI 4140 steel. A large number of factors with unknown relations among them influence this parameter, which makes mathematical modelling extremely complicated. Different values of cutting speed, feed rate, depth of cut (the cutting regime), and workpiece hardness cause different surface roughness values. Modelling with soft computing techniques can be very useful in such cases. This paper presents the use of a fuzzy logic-based system for determining the metal machining process parameters in order to find proper values of the cutting regime.
Keywords: Metal machining, surface roughness, fuzzy logic, process modelling.
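The following is a minimal Mamdani-style sketch of such a fuzzy system with two inputs and triangular membership functions; the linguistic ranges and rules are invented for illustration and are not the membership functions identified in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def predict_roughness(speed, feed):
    # Hypothetical linguistic terms; the ranges are illustrative, not the paper's data.
    speed_low, speed_high = tri(speed, 50, 100, 200), tri(speed, 100, 250, 400)
    feed_low,  feed_high  = tri(feed, 0.05, 0.1, 0.2), tri(feed, 0.1, 0.3, 0.5)

    ra = np.linspace(0.0, 8.0, 801)                 # output universe, Ra in micrometres
    ra_small, ra_large = tri(ra, 0.0, 1.0, 3.0), tri(ra, 2.0, 5.0, 8.0)

    # Mamdani rules: min for AND, clip consequents, aggregate with max.
    fire_small = min(speed_high, feed_low)          # high speed & low feed -> small Ra
    fire_large = min(speed_low, feed_high)          # low speed & high feed -> large Ra
    aggregated = np.maximum(np.minimum(fire_small, ra_small),
                            np.minimum(fire_large, ra_large))

    if aggregated.sum() == 0:                       # no rule fired
        return None
    return float((ra * aggregated).sum() / aggregated.sum())   # centroid defuzzification

print(predict_roughness(speed=280, feed=0.12))
```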
748 Etiquette Learning and Public Speaking: Early Etiquette Learning and Its Impact on Higher Education and Working Professionals
Authors: Simran Ballani
Abstract:
The purpose of this paper is to call on education professionals to implement etiquette and public speaking skills for preschool, primary, middle, and higher school students. The author aims to present the importance of an etiquette learning and public speaking curriculum for preschoolers, reflect on experiences from implementing the curriculum, and discuss the effect of that implementation on higher education and the global job market. The author's aim in introducing this curriculum was to provide children with innovative learning and all-round development. Training in these soft skills at the kindergarten level can have a long-term effect on social behaviors, which in turn can contribute to professional success once students are ready for campus recruitment and the global job market. Additionally, if preschoolers learn polite, appropriate behavior at an early age, they will become more socially attentive and display good manners as adults. It is easier to nurture these skills in a child than to change bad manners in adulthood. Preschool and kindergarten education can provide the platform for children to learn these crucial soft skills irrespective of their ethnicity or economic or social background. Skills developed at such an early age can go a long way toward shaping better and more confident individuals. Unfortunately, access to etiquette learning and public speaking education is not standardized at the pre-primary or primary level, and such content is rarely embedded in the kindergarten curriculum. All young children should be provided with an equal opportunity to learn these soft skills, which are essential for finding their place in the job market.
Keywords: Etiquette learning, public speaking, preschoolers, overall child development, early childhood interventions, soft skills.
747 Efficient Semi-Systolic Finite Field Multiplier Using Redundant Basis
Authors: Hyun-Ho Lee, Kee-Won Kim
Abstract:
Arithmetic operations over GF(2^m) have been extensively used in error-correcting codes and public-key cryptography schemes. Finite field arithmetic includes addition, multiplication, division, and inversion operations. Addition is very simple and can be implemented with an extremely simple circuit; the other operations are much more complex. Multiplication is the most important operation for cryptosystems such as the elliptic curve cryptosystem, since exponentiation, division, and multiplicative inversion can be performed by iterated multiplication. In this paper, we present a parallel computation algorithm that performs Montgomery multiplication over a finite field using a redundant basis. Based on the multiplication algorithm, we also present an efficient semi-systolic multiplier over the finite field. The multiplier has lower space and time complexities than related multipliers: compared to the corresponding existing structures, it saves at least 5% area, 50% time, and 53% area-time (AT) complexity. Accordingly, it is well suited to VLSI implementation and can easily be applied as a basic component for computing complex operations over a finite field, such as inversion and division.
Keywords: Finite field, Montgomery multiplication, systolic array, cryptography.
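For orientation, the sketch below shows plain polynomial-basis multiplication in GF(2^m) (carry-less multiply followed by reduction); it is not the redundant-basis Montgomery formulation or the semi-systolic architecture proposed in the paper.

```python
def gf2m_multiply(a: int, b: int, m: int, irreducible: int) -> int:
    """Multiply field elements a, b in GF(2^m), polynomial basis.
    `irreducible` encodes the reduction polynomial including the x^m term."""
    # Carry-less (XOR) schoolbook multiplication.
    product = 0
    for i in range(m):
        if (b >> i) & 1:
            product ^= a << i
    # Reduce modulo the irreducible polynomial, highest degree first.
    for i in range(2 * m - 2, m - 1, -1):
        if (product >> i) & 1:
            product ^= irreducible << (i - m)
    return product

# GF(2^8) with the AES polynomial x^8 + x^4 + x^3 + x + 1 (0x11B).
print(hex(gf2m_multiply(0x57, 0x83, 8, 0x11B)))   # expected 0xc1 (classic AES example)
```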
746 On the Factors Affecting Computing Students’ Awareness of the Latest ICTs
Authors: O. D. Adegbehingbe, S. D. Eyono Obono
Abstract:
The education sector is constantly faced with rapid changes in technology, both in ensuring that the curriculum is up to date and in making sure that students are aware of these technological changes. This challenge is the motivation for this study, which examines the factors affecting computing students’ awareness of the latest Information and Communication Technologies (ICTs). The aim of this study is divided into two sub-objectives: the selection of relevant theories and the design of a conceptual model to support them, and the empirical testing of the designed model. The first objective is achieved by a review of existing literature on technology adoption theories and models. The second objective is achieved using a survey of computing students in the four universities of the KwaZulu-Natal province of South Africa. Data collected from this survey are analyzed using the Statistical Package for the Social Sciences (SPSS) with descriptive statistics, ANOVA, and Pearson correlations. The main hypothesis of this study is that there is a relationship between the demographics and prior conditions of computing students and their awareness of general ICT trends and of Digital Switch Over (DSO), a new technology involving the change from analog to digital television broadcasting in order to achieve improved spectrum efficiency. The prior conditions considered in this study are students’ perceived exposure to career guidance and students’ perceived curriculum currency. The results confirm that gender, ethnicity, and having taken a high school computing course affect students’ perceived curriculum currency, while high school location affects students’ awareness of DSO. The results also confirm that there is a relationship between students’ prior conditions and their awareness of general ICT trends and of DSO in particular.
Keywords: Education, Information Technologies, IDT, awareness.
745 Grid-HPA: Predicting Resource Requirements of a Job in the Grid Computing Environment
Authors: M. Bohlouli, M. Analoui
Abstract:
For complete support of Quality of Service, it is better that the Grid computing environment itself predicts the resource requirements of a job by using dedicated methods. Exact and correct prediction allows the required resources to be matched exactly with the available resources. After the execution of each job, the resources it used are saved in an active database named "History". First, some attributes are extracted from the submitted job; then, according to a defined similarity algorithm, the most similar previously executed job is retrieved from "History", and the resource requirements are predicted using statistical techniques such as linear regression or averaging. The new idea in this research is based on the active database and centralized history maintenance. Implementation and testing of the proposed architecture result in an accuracy of 96.68% in predicting the CPU usage of jobs, 91.29% for memory usage, and 89.80% for bandwidth usage.
Keywords: Active Database, Grid Computing, Resource Requirement Prediction, Scheduling.
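A minimal sketch of the history-based idea follows, assuming hypothetical job attributes and usage records and using a simple nearest-neighbour average in place of the paper's similarity algorithm.

```python
import numpy as np

# Hypothetical "History" records: (job attributes, observed resource usage).
# Attributes: [input size in MB, requested tasks]; usage: [CPU hours, memory GB, bandwidth Mbps].
history_attrs = np.array([[100, 4], [250, 8], [400, 16], [900, 32], [1500, 64]], dtype=float)
history_usage = np.array([[1.2, 2.0, 10], [2.9, 4.1, 22], [4.8, 8.2, 35],
                          [10.5, 15.9, 80], [17.8, 31.5, 140]], dtype=float)

def predict_requirements(new_job, k=2):
    """Average the resource usage of the k most similar jobs in History."""
    scale = history_attrs.max(axis=0)                       # crude normalisation
    dist = np.linalg.norm(history_attrs / scale - new_job / scale, axis=1)
    nearest = np.argsort(dist)[:k]
    return history_usage[nearest].mean(axis=0)

cpu, mem, bw = predict_requirements(np.array([500.0, 16.0]))
print(f"predicted CPU={cpu:.1f} h, memory={mem:.1f} GB, bandwidth={bw:.0f} Mbps")
```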
744 An Algorithm for Computing the Analytic Singular Value Decomposition
Authors: Drahoslava Janovska, Vladimir Janovsky, Kunio Tanabe
Abstract:
A proof of convergence of a new continuation algorithm for computing the Analytic SVD for a large sparse parameter-dependent matrix is given. The algorithm itself was developed and numerically tested in [5].
Keywords: Analytic Singular Value Decomposition, large sparse parameter-dependent matrices, continuation algorithm of a predictor-corrector type.
743 Software Maintenance Severity Prediction with Soft Computing Approach
Authors: E. Ardil, Erdem Uçar, Parvinder S. Sandhu
Abstract:
Since the majority of faults are found in only a few of a system's modules, there is a need to identify the modules that are affected more severely than others and to perform maintenance on time, especially for critical applications. In this paper, we explore different predictor models on NASA's public domain defect dataset, coded in the Perl programming language. Different machine learning algorithms belonging to the different learner categories of the WEKA project, including a Mamdani-based fuzzy inference system and a neuro-fuzzy system, are evaluated for modelling maintenance severity, i.e., the impact of fault severity. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE). The results show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models and hence can be used for maintenance severity prediction of software.
Keywords: Software Metrics, Fuzzy, Neuro-Fuzzy, Software Faults, Accuracy, MAE, RMSE.
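A small sketch of the evaluation loop is given below, using scikit-learn with clearly synthetic placeholder data and a decision tree regressor standing in for the WEKA fuzzy and neuro-fuzzy learners; MAE and RMSE are computed as in the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Placeholder module metrics (e.g., size, complexity, operator counts) and a severity
# score; these stand in for the NASA defect data, which is not reproduced here.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 3))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 200)   # synthetic severity

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = DecisionTreeRegressor(max_depth=4).fit(X_train, y_train)
pred = model.predict(X_test)

mae = mean_absolute_error(y_test, pred)
rmse = np.sqrt(mean_squared_error(y_test, pred))
print(f"MAE={mae:.3f}  RMSE={rmse:.3f}")
```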
742 Fast 3D Collision Detection Algorithm using 2D Intersection Area
Authors: Taehyun Yoon, Keechul Jung
Abstract:
There has been much research on detecting collisions between real and virtual objects in 3D space. In general, these techniques require huge computing power, so many studies rely on cloud computing, network computing, and distributed computing. For this reason, this paper proposes a novel fast 3D collision detection algorithm between real and virtual objects using 2D intersection areas. The proposed algorithm uses four cameras and a coarse-and-fine method to improve the accuracy and speed of collision detection. In the coarse step, the system examines the intersection area between the real and virtual object silhouettes from all camera views; the result of this step is the set of virtual sensors that may be in collision in 3D space. To decide on collisions accurately, in the fine step, the system performs collision detection in 3D space using the visual hull algorithm. The performance of the algorithm is verified by comparison with an existing algorithm. We believe the proposed algorithm can help many other research areas and applications such as HCI, augmented reality, and intelligent spaces.
Keywords: Collision Detection, Computer Vision, Human Computer Interaction, Visual Hull
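The coarse step can be illustrated with a few lines that test silhouette overlap per view; the tiny binary masks below are placeholders, and the fine visual-hull step is not shown.

```python
import numpy as np

def silhouettes_intersect(real_masks, virtual_masks, min_pixels=1):
    """Coarse test: report a possible 3D collision only if the real and virtual
    silhouettes overlap in every camera view (a necessary, not sufficient, condition)."""
    for real, virtual in zip(real_masks, virtual_masks):
        overlap = np.logical_and(real, virtual).sum()
        if overlap < min_pixels:
            return False            # one view separates them -> no collision possible
    return True                     # overlap everywhere -> run the fine (visual hull) step

# Tiny illustrative masks for two camera views (1 = object pixel).
real = [np.array([[0, 1], [0, 1]]), np.array([[1, 1], [0, 0]])]
virt = [np.array([[0, 1], [0, 0]]), np.array([[0, 1], [0, 0]])]
print(silhouettes_intersect(real, virt))   # True: overlap exists in both views
```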
741 Exploring the Combinatorics of Motif Alignments for Accurately Computing E-values from P-values
Authors: T. Kjosmoen, T. Ryen, T. Eftestøl
Abstract:
In biological and biomedical research, motif finding tools are important for locating regulatory elements in DNA sequences. There are many such motif finding tools available, which often yield position weight matrices and significance indicators. These indicators, p-values and E-values, describe the likelihood that a motif alignment is generated by the background process and the expected number of occurrences of the motif in the data set, respectively. The various tools often estimate these indicators differently, making them not directly comparable. One approach for comparing motifs from different tools is computing the E-value as the product of the p-value and the number of possible alignments in the data set. In this paper we explore the combinatorics of the motif alignment models OOPS, ZOOPS, and ANR, and propose a generic algorithm for computing the number of possible combinations accurately. We also show that using the wrong alignment model can give E-values that significantly diverge from their true values.
Keywords: Motif alignment, combinatorics, p-value, E-value, OOPS, ZOOPS, ANR.
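A sketch of the E-value-from-p-value idea under the simplest (OOPS) model is given below; the sequence lengths are hypothetical, and the more careful ZOOPS and ANR counts that the paper derives are not reproduced here.

```python
from math import prod

def oops_alignment_count(sequence_lengths, motif_width):
    """Number of possible alignments under OOPS (exactly one motif occurrence
    per sequence): each sequence of length L offers L - w + 1 start positions."""
    return prod(max(L - motif_width + 1, 0) for L in sequence_lengths)

def e_value(p_value, sequence_lengths, motif_width):
    """E-value approximated as p-value times the number of possible alignments."""
    return p_value * oops_alignment_count(sequence_lengths, motif_width)

lengths = [200, 180, 220]          # hypothetical promoter sequence lengths
print(oops_alignment_count(lengths, motif_width=8))
print(e_value(1e-9, lengths, motif_width=8))
```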
740 Implementation of Watch Dog Timer for Fault Tolerant Computing on Cluster Server
Authors: Meenakshi Bheevgade, Rajendra M. Patrikar
Abstract:
In today's new technology era, clusters have become a necessity for modern computing and data applications, since many applications take a long time (even days or months) to compute. Although parallelization speeds up computation, the time required for many applications can still be large. Thus, the reliability of the cluster becomes a very important issue, and the implementation of a fault tolerance mechanism becomes essential. The difficulty of designing a fault tolerant cluster system increases with the variety of possible failures: the key requirement is that an algorithm which handles a simple failure in the system must also tolerate more severe failures. In this paper, we implement a watchdog timer in a parallel environment to take care of failures. Implementing this simple mechanism in our project allows different types of failures to be handled; consequently, we found that the reliability of the cluster improves.
Keywords: Cluster, Fault tolerant, Grid, Grid Computing System, Meta-computing.
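A single-process Python sketch of the watchdog idea follows; the timeout, failure handler, and heartbeat loop are illustrative and do not reflect the authors' cluster implementation.

```python
import threading
import time

class Watchdog:
    """Restart-style watchdog: if kick() is not called within `timeout` seconds,
    the failure handler fires (here it would restart the monitored application)."""

    def __init__(self, timeout, on_failure):
        self.timeout = timeout
        self.on_failure = on_failure
        self._timer = None

    def _expire(self):
        self.on_failure()
        self.kick()                 # re-arm so the node keeps being monitored

    def kick(self):
        if self._timer:
            self._timer.cancel()
        self._timer = threading.Timer(self.timeout, self._expire)
        self._timer.daemon = True
        self._timer.start()

    def stop(self):
        if self._timer:
            self._timer.cancel()

wd = Watchdog(timeout=1.0, on_failure=lambda: print("heartbeat missed -> restart worker"))
wd.kick()
for _ in range(3):                  # healthy worker sends heartbeats in time
    time.sleep(0.5)
    wd.kick()
time.sleep(2.5)                     # simulated hang: watchdog fires
wd.stop()
```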
739 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics
Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood
Abstract:
We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2), to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.
Keywords: Cloud computing, cybersecurity, digital forensics, Kafka, Kubernetes, Spark.
738 A Statistical Model for the Geotechnical Parameters of Cement-Stabilised Hightown’s Soft Soil: A Case Study of Liverpool, UK
Authors: Hassnen M. Jafer, Khalid S. Hashim, W. Atherton, Ali W. Alattabi
Abstract:
This study investigates the effect of two important parameters (length of the curing period and percentage of added binder) on the strength of soil treated with ordinary Portland cement (OPC). An intermediate-plasticity silty clayey soil with medium organic content was used in this study. This soft soil was treated with different percentages of a commercially available cement, type 32.5-N. Laboratory experiments were carried out on soil treated with 0, 1.5, 3, 6, 9, and 12% OPC by dry weight to determine the effect of OPC on the compaction parameters, consistency limits, and compressive strength. The unconfined compressive strength (UCS) test was carried out on cement-treated specimens after exposing them to different curing periods (1, 3, 7, 14, 28, and 90 days). The results of the UCS test were used to develop a non-linear multi-regression model relating the predicted and measured maximum compressive strength of the treated soil (qu). The results indicated a significant improvement in the plasticity index (IP) with OPC treatment; IP decreased from 20.2 to 14.1 with 12% OPC, and this percentage was enough to increase the UCS of the treated soil to 1362 kPa after 90 days of curing. With respect to the statistical model of the predicted qu, the regression coefficient (R²) was 0.8534, which indicates good reproducibility for the constructed model.
Keywords: Cement admixtures, soft soil stabilisation, geotechnical parameters, unconfined compressive strength, multi-regression model.
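As an illustration of fitting a non-linear multi-regression of this kind, the sketch below uses SciPy's curve_fit with a hypothetical power-law form and placeholder observations (only the 12% OPC, 90-day strength of about 1360 kPa is taken from the abstract); it is not the model reported in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def ucs_model(X, a, b, c):
    """Hypothetical power-law form q_u = a * (cement %)^b * (curing days)^c;
    the paper's actual regression terms are not reproduced here."""
    cement, days = X
    return a * cement ** b * days ** c

# Placeholder observations (cement %, curing days, q_u in kPa) for illustration only.
cement = np.array([1.5, 3, 6, 9, 12, 12], dtype=float)
days = np.array([7, 7, 28, 28, 28, 90], dtype=float)
qu = np.array([150, 260, 520, 780, 1050, 1360], dtype=float)

params, _ = curve_fit(ucs_model, (cement, days), qu, p0=[50, 1.0, 0.3])
predicted = ucs_model((cement, days), *params)
ss_res = np.sum((qu - predicted) ** 2)
ss_tot = np.sum((qu - qu.mean()) ** 2)
print("fitted a, b, c:", np.round(params, 3), " R^2:", round(1 - ss_res / ss_tot, 4))
```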
737 Computing Visibility Subsets in an Orthogonal Polyhedron
Authors: Jefri Marzal, Hong Xie, Chun Che Fung
Abstract:
Visibility problems are central to many computational geometry applications. One of the typical visibility problems is computing the view from a given point. In this paper, a linear time procedure is proposed to compute the visibility subsets from a corner of a rectangular prism in an orthogonal polyhedron. The proposed algorithm could be useful to solve classic 3D problems.
Keywords: Visibility, rectangular prism, orthogonal polyhedron.
736 Dynamic Load Balancing Strategy for Grid Computing
Authors: Belabbas Yagoubi, Yahya Slimani
Abstract:
Workload and resource management are two essential functions provided at the service level of the grid software infrastructure. To improve the global throughput of these software environments, workloads have to be evenly scheduled among the available resources. To realize this goal, several load balancing strategies and algorithms have been proposed. Most strategies were developed with a homogeneous set of sites linked by homogeneous, fast networks in mind. For computational grids, however, new issues must be addressed, namely heterogeneity, scalability, and adaptability. In this paper, we propose a layered algorithm which achieves dynamic load balancing in grid computing. Based on a tree model, our algorithm has the following main features: (i) it is layered; (ii) it supports heterogeneity and scalability; and (iii) it is totally independent of any physical architecture of a grid.
Keywords: Grid computing, load balancing, workload, tree based model.
735 Soft Computing Based Cluster Head Selection in Wireless Sensor Network Using Bacterial Foraging Optimization Algorithm
Authors: A. Rajagopal, S. Somasundaram, B. Sowmya, T. Suguna
Abstract:
Wireless Sensor Networks (WSNs) enable new applications and, because of energy and bandwidth constraints, need non-conventional protocol paradigms. In a WSN, sensor node lifetime is a critical parameter. Research on lifetime extension is based on the Low-Energy Adaptive Clustering Hierarchy (LEACH) scheme, which rotates the Cluster Head (CH) role among sensor nodes to distribute energy consumption over all network nodes. CH selection in a WSN greatly affects network energy efficiency. This study proposes an improved CH selection for efficient data aggregation in sensor networks. The new algorithm is based on Bacterial Foraging Optimization (BFO) incorporated in LEACH.
Keywords: Bacterial Foraging Optimization (BFO), Cluster Head (CH), Data-aggregation protocols, Low-Energy Adaptive Clustering Hierarchy (LEACH).
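For context, the sketch below shows the classic LEACH threshold-based CH election that the proposal builds on; the node count and probability p are assumptions, and the BFO enhancement itself is not shown.

```python
import random

def leach_cluster_heads(node_ids, p=0.05, round_no=0, recent_chs=frozenset()):
    """Classic LEACH threshold: a node that has not served as CH in the last 1/p
    rounds elects itself with probability T(n) = p / (1 - p * (r mod 1/p))."""
    period = int(1 / p)
    threshold = p / (1 - p * (round_no % period))
    heads = []
    for node in node_ids:
        if node in recent_chs:                 # already served recently, skip
            continue
        if random.random() < threshold:
            heads.append(node)
    return heads

random.seed(42)
nodes = list(range(100))
print("round 0 cluster heads:", leach_cluster_heads(nodes, p=0.05, round_no=0))
```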
734 A Novel Zero Voltage Transition Synchronous Buck Converter for Portable Application
Authors: S. Pattnaik, A. K. Panda, Aroul K., K. K. Mahapatra
Abstract:
This paper proposes a zero-voltage transition (ZVT) PWM synchronous buck converter, which is designed to operate at the low output voltage and high efficiency typically required for portable systems. To make the DC-DC converter efficient at lower voltages, a synchronous converter is an obvious choice because of the lower conduction loss in the diode. The losses of the high-side MOSFET are dominated by switching losses, which are eliminated by the soft-switching technique. Additionally, the resonant auxiliary circuit designed here is also free of switching losses. The suggested procedure ensures an efficient converter. Theoretical analysis, computer simulation, and experimental results are presented to explain the proposed scheme.
Keywords: DC-DC Converter, Switching loss, Synchronous Buck, Soft switching, ZVT.
733 Performance Evaluation of Data Transfer Protocol GridFTP for Grid Computing
Authors: Hiroyuki Ohsaki, Makoto Imase
Abstract:
In Grid computing, a data transfer protocol called GridFTP has been widely used for efficiently transferring large volumes of data. Currently, two versions of the GridFTP protocol, GridFTP version 1 (GridFTP v1) and GridFTP version 2 (GridFTP v2), have been proposed in the GGF. GridFTP v2 supports several advanced features such as data streaming, dynamic resource allocation, and checksum transfer by defining a transfer mode called X-block mode. However, the effectiveness of GridFTP v2 has not been fully investigated in the literature. In this paper, we therefore quantitatively evaluate the performance of GridFTP v1 and GridFTP v2 using mathematical analysis and simulation experiments. We reveal the performance limitation of GridFTP v1 and quantitatively show the effectiveness of GridFTP v2. Through several numerical examples, we show that by utilizing the data streaming feature, the average file transfer time of GridFTP v2 is significantly smaller than that of GridFTP v1.
Keywords: Grid Computing, GridFTP, Performance Evaluation, Queuing Theory.
732 Accelerating the Uptake of Smart City Applications through Cloud Computing
Authors: Panagiotis Tsarchopoulos, Nicos Komninos, Christina Kakderi
Abstract:
Smart cities are high on the political agenda around the globe. However, planning smart cities and deploying applications dealing with the complex problems of the urban environment is a very challenging task that is difficult for cities to undertake on their own. We argue that the uptake of smart city strategies is facilitated, first, through the development of smart city application repositories allowing re-use of already developed and tested software, and, second, through cloud computing, which frees city authorities from resource constraints, technical or financial, and has a higher impact and greater effect at the city level. The combination of these two solutions allows city governments and municipalities to select and deploy a large number of applications dedicated to different city functions, which collectively could create a multiplier effect with a greater impact on the urban environment.
Keywords: Smart cities, applications, cloud computing, migration to the cloud, application repositories.
731 Investigation of the Physical Computing in Computational Thinking Practices, Computer Programming Concepts and Self-Efficacy for Crosscutting Ideas in STEM Content Environments
Authors: Sarantos Psycharis
Abstract:
Physical computing, as an instructional model, is applied in the framework of Engineering Pedagogy to teach “transversal/cross-cutting ideas” in a STEM content approach. Labview and Arduino were used to connect the physical world with real data in the framework of the so-called computational experiment. Tertiary prospective engineering educators were engaged during their course, and Computational Thinking (CT) concepts were recorded before and after the intervention across didactic activities, using validated questionnaires on the relationship between self-efficacy, computer programming, and CT concepts when STEM content epistemology is implemented in alignment with the computational pedagogy model. Results show a significant change in students’ responses for self-efficacy in CT before and after the instruction. Results also indicate a significant relation between the responses to the different CT concepts and practices. According to the findings, STEM content epistemology combined with physical computing should be a good candidate as a learning and teaching approach in university settings that enhances students’ engagement in CT concepts and practices.
Keywords: STEM, computational thinking, physical computing, Arduino, Labview, self-efficacy.
730 Influence of Cell-free Proteins in the Nucleation of CaCO3 Crystals in Calcified Endoskeleton
Authors: M. Azizur Rahman, Tamotsu Oomori
Abstract:
Calcite and aragonite are the two common polymorphs of CaCO3 observed as biominerals. Sea water universally contains a high concentration of Mg2+ (50 mM) relative to Ca2+ (10 mM), and in in vivo crystallization Mg2+ inhibits calcite formation. For this reason, stony coral skeletons may form only aragonite crystals during biocalcification. It is special in the case of soft corals that only calcite crystals are formed; this interesting phenomenon, still uncharacterized in the marine environment, has been explored in this study using newly purified cell-free proteins isolated from the endoskeletal sclerites of a soft coral. By recording the decline of pH in vitro, the control of CaCO3 nucleation and crystal growth by the cell-free proteins was revealed. Using an Atomic Force Microscope, we find that these endoskeletal cell-free proteins significantly shape the molecular-scale kinetics of crystal formation and that these proteins act as surfactants to promote ion attachment at calcite steps.
Keywords: Biomineralization, Calcite, Cell-free protein, Soft coral.
729 FEA-Aided Design, Optimization and Development of an Axial Flux Motor for Implantable Ventricular Assist Device
Authors: Neethu S., Shinoy K.S., A.S. Shajilal
Abstract:
This paper presents the optimal design and development of an axial flux motor for a blood pump application. With the design objective of maximizing motor efficiency and torque, different topologies of AFPM machines have been examined. Selection of the optimal magnet fraction, a Halbach arrangement of the rotor magnets, and the use of Soft Magnetic Composite (SMC) material for the stator core result in a novel motor with improved efficiency and torque profile. The results of a 3D finite element analysis for the novel motor are shown.
Keywords: Axial flux motor, Finite Element Methods, Halbach array, Left Ventricular Assist Device, Soft magnetic composite.
728 Biomechanical Prediction of Veins and Soft Tissues beneath Compression Stockings Using Fluid-Solid Interaction Model
Authors: Chongyang Ye, Rong Liu
Abstract:
Elastic compression stockings (ECSs) have been widely applied in the prophylaxis and treatment of chronic venous insufficiency of the lower extremities. The medical function of an ECS is to improve venous return and increase the muscular pumping action to facilitate blood circulation, which is largely determined by the complex interaction between the ECS and the lower limb tissues. Understanding the mechanical transmission of the ECS along the skin surface, deeper tissues, and vascular system is essential to assess the effectiveness of ECSs. In this study, a three-dimensional (3D) finite element (FE) model of the leg-ECS system integrated with a 3D fluid-solid interaction (FSI) model of the leg-vein system was constructed to analyze the biomechanical properties of veins and soft tissues under different ECS compressions. Magnetic Resonance Imaging (MRI) of the human leg was divided into three regions: soft tissues, bones (tibia and fibula), and veins (peroneal vein, great saphenous vein, and small saphenous vein). ECSs with pressures ranging from 15 to 26 mmHg (Classes I and II) were adopted in the developed FE-FSI model. The soft tissue was assumed to follow a Neo-Hookean hyperelastic model with fixed bones, and the ECSs were regarded as an orthotropic elastic shell. The interfacial pressure and stress transmission were simulated by the FE model, and venous hemodynamic properties were simulated by the FSI model. Experimental validation indicated that the simulated interfacial pressure distributions were in accordance with the pressure measurement results. The developed model can be used to predict the interfacial pressure, stress transmission, and venous hemodynamics exerted by ECSs and to optimize the structure and material properties of ECS designs, thus improving the efficiency of compression therapy.
Keywords: Elastic compression stockings, fluid-solid interaction, tissue and vein properties, prediction.