Search results for: Soft Computing.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 925


685 Governance, Risk Management, and Compliance Factors Influencing the Adoption of Cloud Computing in Australia

Authors: Tim Nedyalkov

Abstract:

A business decision to move to the cloud brings fundamental changes in how an organization develops and delivers its Information Technology solutions. The accelerated pace of digital transformation across businesses and government agencies increases the reliance on cloud-based services. Collecting, managing, and retaining large amounts of data in cloud environments make information security and data privacy protection essential. It becomes even more important to understand what key factors drive successful cloud adoption following the commencement of the Privacy Amendment (Notifiable Data Breaches) Act 2017 in Australia, as the regulatory changes impact many organizations and industries. This quantitative correlational research investigated the governance, risk management, and compliance factors that contribute to cloud security success and influence the adoption of cloud computing within an organizational context after the commencement of the NDB scheme. The results and findings demonstrated that corporate information security policies, data storage location, management understanding of data governance responsibilities, and regular compliance assessments are the factors influencing cloud computing adoption. The research has implications for organizations, future researchers, practitioners, policymakers, and cloud computing providers seeking to meet rapidly changing regulatory and compliance requirements.

Keywords: Cloud compliance, cloud security, cloud security governance, data governance, privacy protection.

684 Context Modeling and Context-Aware Service Adaptation for Pervasive Computing Systems

Authors: Moeiz Miraoui, Chakib Tadj, Chokri ben Amar

Abstract:

Devices in a pervasive computing system (PCS) are characterized by their context-awareness, which permits them to proactively provide adapted services to the user and to applications. To do so, context must be well understood and modeled in an appropriate form that facilitates its sharing between devices and provides a high level of abstraction. The most interesting methods for modeling context are those based on ontologies; however, most of the proposed methods fail to propose a generic ontology for context, which limits their usability and keeps them specific to a particular domain. The adaptation task must be carried out automatically, without explicit intervention by the user. Devices of a PCS must acquire some intelligence that permits them to sense the current context and trigger the appropriate service, or provide a service in a more suitable form. In this paper, we propose a generic service ontology for context modeling and a context-aware service adaptation based on a service-oriented definition of context.
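As a rough illustration of the service-oriented view of context described above (a minimal sketch, not the ontology or adaptation mechanism proposed by the authors; the context attributes, service names, and rules below are hypothetical):

```python
# Minimal sketch: adapt the form of a "notification" service to the sensed context.
# Context is treated in a service-oriented way: only attributes that trigger a
# service or change its form are considered part of the context.
from dataclasses import dataclass

@dataclass
class Context:
    location: str       # e.g. "meeting_room", "home" (hypothetical values)
    noise_level: float  # 0.0 (silent) .. 1.0 (loud)
    user_busy: bool

def adapt_notification(ctx: Context) -> str:
    """Return the form in which the notification service should be delivered."""
    if ctx.user_busy or ctx.location == "meeting_room":
        return "silent_visual"   # do not disturb: show on screen only
    if ctx.noise_level > 0.7:
        return "vibration"       # too loud for audio to be noticed
    return "audio"               # default form

if __name__ == "__main__":
    sensed = Context(location="meeting_room", noise_level=0.3, user_busy=False)
    print(adapt_notification(sensed))  # -> silent_visual
```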

Keywords: Pervasive computing system, context, context-awareness, service, context modeling, ontology, adaptation, machine learning.

683 Agent Decision using Granular Computing in Traffic System

Authors: Yasser F. Hassan, Marwa Abdeen, Mustafa Fahmy

Abstract:

In recent years, multi-agent systems have emerged as one of the most interesting architectures facilitating distributed collaboration and distributed problem solving. Each node (agent) of the network might pursue its own agenda, exploit its environment, develop its own problem-solving strategy, and establish the required communication strategies. Within each node of the network, one could encounter a diversity of problem-solving approaches. Quite commonly, the agents realize their processing at the level of information granules that is most suitable from their local points of view. Information granules can come at various levels of granularity. Each agent could exploit a certain formalism of information granulation, engaging the machinery of fuzzy sets, interval analysis, or rough sets, to name a few dominant technologies of granular computing. With this in mind, a fundamental issue arises of forming effective interaction linkages between the agents so that they fully broadcast their findings and benefit from interacting with others.
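To make the rough-set flavour of granular computing concrete (a minimal sketch under assumed data; the attributes and decision values below are hypothetical and not taken from the paper's traffic system):

```python
# Lower and upper rough-set approximations of a decision class.
# Objects that are indiscernible on the chosen attributes form granules;
# a class is approximated from below (certainly in) and from above (possibly in).
from collections import defaultdict

# Hypothetical traffic observations: (density, speed) -> congestion decision
table = {
    "x1": (("high", "low"),   "congested"),
    "x2": (("high", "low"),   "congested"),
    "x3": (("high", "low"),   "free"),       # conflicts with x1, x2 -> boundary region
    "x4": (("low", "high"),   "free"),
    "x5": (("medium", "low"), "congested"),
}

def approximations(table, target_decision):
    granules = defaultdict(set)              # indiscernibility classes
    for obj, (attrs, _) in table.items():
        granules[attrs].add(obj)
    target = {o for o, (_, d) in table.items() if d == target_decision}
    lower, upper = set(), set()
    for granule in granules.values():
        if granule <= target:
            lower |= granule                  # granule certainly belongs to the class
        if granule & target:
            upper |= granule                  # granule possibly belongs to the class
    return lower, upper

lower, upper = approximations(table, "congested")
print("lower approximation:", sorted(lower))  # ['x5'] -- certainly congested
print("upper approximation:", sorted(upper))  # ['x1', 'x2', 'x3', 'x5'] -- possibly congested
```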

Keywords: Granular computing, rough sets, agents, traffic system.

682 Enabling Remote Desktop in a Virtualized Environment for Cloud Services

Authors: Shuen-Tai Wang, Yu-Ching Lin, Hsi-Ya Chang

Abstract:

Cloud computing is the innovative and leading information technology model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort. This paper presents our development on enabling an individual user's desktop in a virtualized environment, where the desktop is stored on a remote virtual machine rather than locally. We present initial work on the integration of virtual desktop and application sharing with virtualization technology. Given the development of remote desktop virtualization, this proposed effort has the potential to provide an efficient, resilient, and elastic environment for online cloud services. Users no longer need to bear the cost of software licenses and platform maintenance. Moreover, this development also helps boost user productivity by promoting a flexible model that lets users access their desktop environments from virtually anywhere.

Keywords: Cloud Computing, Virtualization, Virtual Desktop, Elastic Environment.

681 Organizational Data Security in Perspective of Ownership of Mobile Devices Used by Employees for Works

Authors: B. Ferdousi, J. Bari

Abstract:

With the advancement of mobile computing, employees increasingly do their job-related work using personally owned mobile devices or organization-owned devices. The Bring Your Own Device (BYOD) model allows employees to use their own mobile devices for job-related work, while the Corporate Owned, Personally Enabled (COPE) model allows both organizations and employees to install applications onto organization-owned mobile devices used for job-related work. While there are many benefits of using mobile computing for job-related work, there are also serious concerns about different levels of threats to organizational data security. Consequently, it is crucial to know the level of threat to organizational data security in the BYOD and COPE models. It is also important to ensure that employees comply with the organizational data security policy. This paper discusses organizational data security issues from the perspective of ownership of the mobile devices used by employees, especially in the BYOD and COPE models. It appears that while the BYOD model has many benefits, it carries relatively more data security risks than the COPE model. The findings also showed that in both BYOD and COPE environments, a more practical approach towards achieving secure mobile computing in an organizational setting is the development of comprehensive cybersecurity policies that balance employees' need for convenience with organizational data security. The study helps to figure out the compliance requirements and the risks of security breach in the BYOD and COPE models.

Keywords: Data security, mobile computing, BYOD, COPE, cybersecurity policy, cybersecurity compliance.

680 Segmenting Ultrasound B-Mode Images Using RiIG Distributions and Stochastic Optimization

Authors: N. Mpofu, M. Sears

Abstract:

In this paper, we propose a novel algorithm for delineating the endocardial wall from a human heart ultrasound scan. We assume that the gray levels in the ultrasound images are independent and identically distributed random variables with different Rician Inverse Gaussian (RiIG) distributions. Both synthetic and real clinical data will be used for testing the algorithm. Algorithm performance will be evaluated, first, against the expert radiologist's evaluation of a soft copy of an ultrasound scan during the scanning process and, second, against the doctor's conclusion after going through a printed copy of the same scan. Successful implementation of this algorithm should make it possible to differentiate normal from abnormal soft tissue and help identify the disease, what stage it is in, and how best to treat the patient. We hope that an automated system that uses this algorithm will be useful in public hospitals, especially in Third World countries where problems such as a shortage of skilled radiologists and a shortage of ultrasound machines are common. These public hospitals are usually the first and last stop for most patients in these countries.

Keywords: Endocardial Wall, Rician Inverse Gaussian Distributions, Segmentation, Ultrasound Images.

679 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, which are grouped into four main categories: application development, task scheduling, in-memory computing, and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: Co-scheduling, data-centric, data-intensive, data locality, in-memory storage, large scale.

678 Influence of Local Soil Conditions on Optimal Load Factors for Seismic Design of Buildings

Authors: Miguel A. Orellana, Sonia E. Ruiz, Juan Bojórquez

Abstract:

Optimal load factors (dead, live and seismic) used for the design of buildings may be different depending on the seismic ground motion characteristics to which the buildings are subjected, which are closely related to the type of soil conditions where the structures are located. The influence of the type of soil on those load factors is analyzed in the present study. A methodology that is useful for establishing optimal load factors that minimize the cost over the life cycle of the structure is employed; as a restriction, it is established that the probability of structural failure must be less than or equal to a prescribed value. The life-cycle cost model used here includes different types of costs. The optimization methodology is applied to two groups of reinforced concrete buildings. One set (consisting of 4-, 7-, and 10-story buildings) is located on firm ground (with a dominant period Ts = 0.5 s) and the other (consisting of 6-, 12-, and 16-story buildings) on soft soil (Ts = 1.5 s) of Mexico City. Each group of buildings is designed using different combinations of load factors. The statistics of the maximum inter-story drifts (associated with the structural capacity) are found by means of incremental dynamic analyses. The buildings located in the firm zone are analyzed under the action of 10 strong seismic records, and those in the soft zone under 13 strong ground motions. All the motions correspond to seismic subduction events with magnitudes M = 6.9. Then, the structural damage and the expected total costs corresponding to each group of buildings are estimated. It is concluded that the optimal load factor combination for the design of buildings located on firm ground is different from that for buildings located on soft soil.
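A generic statement of the constrained life-cycle cost optimization described above (a sketch only; the symbols and cost terms are illustrative and not necessarily the authors' exact formulation):

```latex
\min_{\alpha_D,\ \alpha_L,\ \alpha_S} \; E\!\left[C_T(\alpha)\right]
  = C_I(\alpha) + C_D\, P_f(\alpha)
\qquad \text{subject to} \qquad P_f(\alpha) \le P_f^{\mathrm{target}},
```

where $\alpha = (\alpha_D, \alpha_L, \alpha_S)$ are the dead, live and seismic load factors, $C_I$ is the initial construction cost, $C_D$ the expected cost of damage over the life cycle, and $P_f$ the probability of structural failure.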

Keywords: Life-cycle cost, optimal load factors, reinforced concrete buildings, total costs, type of soil.

677 VLSI Design of 2-D Discrete Wavelet Transform for Area-Efficient and High-Speed Image Computing

Authors: Mountassar Maamoun, Mehdi Neggazi, Abdelhamid Meraghni, Daoud Berkani

Abstract:

This paper presents a VLSI design approach for high-speed, real-time 2-D Discrete Wavelet Transform computing. The proposed architecture, based on a new and fast convolution approach, reduces the hardware complexity in addition to reducing the critical path to the multiplier delay. Furthermore, an advanced two-dimensional (2-D) discrete wavelet transform (DWT) implementation, with an efficient memory area, is designed to produce one output in every clock cycle. As a result, a very high speed is attained. The system is verified, using JPEG2000 coefficient filters, on a Xilinx Virtex-II Field Programmable Gate Array (FPGA) device without accessing any external memory. The resulting computing rate is up to 270 Msamples/s, and the (9,7) 2-D wavelet filter uses only 18 kb of memory (16 kb of first-in-first-out memory) for a 256×256 image size. In this way, the developed design requires reduced memory and provides very high-speed processing as well as high PSNR quality.
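For a software reference of the transform such an architecture computes, one can use a standard library implementation (a sketch assuming the PyWavelets package is available; "bior4.4" is PyWavelets' name for the CDF 9/7 biorthogonal filters used in JPEG2000 lossy coding):

```python
# One level of a 2-D DWT with the (9,7) biorthogonal filters, usable as a
# golden model against which a hardware implementation could be checked.
import numpy as np
import pywt

image = np.random.rand(256, 256)                 # stand-in for a test image
cA, (cH, cV, cD) = pywt.dwt2(image, "bior4.4")   # approximation + 3 detail subbands

# Each subband is roughly half the image size per dimension (plus boundary extension).
print(cA.shape, cH.shape, cV.shape, cD.shape)

# Perfect-reconstruction check of the reference transform
recon = pywt.idwt2((cA, (cH, cV, cD)), "bior4.4")
print(np.allclose(recon, image))
```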

Keywords: Discrete Wavelet Transform (DWT), Fast Convolution, FPGA, VLSI.

676 A Soft Switching PWM DC-DC Boost Converter with Increased Efficiency by Using ZVT-ZCT Techniques

Authors: Yakup Sahin, Naim Suleyman Ting, Ismail Aksoy

Abstract:

In this paper, an improved active snubber cell is proposed for the soft switching (SS) family of pulse width modulation (PWM) DC-DC converters. The improved snubber cell provides zero-voltage transition (ZVT) turn-on and zero-current transition (ZCT) turn-off for the main switch. The snubber cell decreases EMI noise and operates with SS over a wide range of line and load voltages. Besides, all of the semiconductor devices in the converter operate with SS. There is no additional voltage or current stress on the main devices. Additionally, no extra voltage stress occurs on the auxiliary switch, and its current stress is at an acceptable value. The improved converter has a low cost and a simple structure. The theoretical analysis of the converter is clarified, and the operating states are given in detail. The experimental results are obtained from a 500 W, 100 kHz prototype. It is observed that the experimental results agree very well with the theoretical analysis of the converter.

Keywords: Active snubber cells, DC-DC converters, zero-voltage transition, zero-current transition.

675 An Examination of the Factors Affecting the Adoption of Cloud Enterprise Resource Planning Systems in Egyptian Companies

Authors: Mayar A. Omar, Ismail Gomaa, Heba Badawy, Hosam Moubarak

Abstract:

Enterprise resource planning (ERP) is an integrated system that helps companies manage their resources. There are two types of ERP systems: traditional ERP systems and cloud ERP systems. Cloud ERP systems were introduced after the development of cloud computing technology. This research aims to identify the factors that affect the adoption of cloud ERP in Egyptian companies. Moreover, our study aims to provide guidance to Egyptian companies in the cloud ERP adoption decision and to contribute to increasing the number of cloud ERP studies conducted in the Middle East and in developing countries. There are many factors influencing the adoption of cloud ERP in Egyptian organizations, which are discussed and explained in the research. Those factors are examined by combining the Diffusion of Innovation (DOI) theory and the technology-organization-environment (TOE) framework. Data were collected through a survey that was developed using constructs from existing studies of cloud computing and cloud ERP technologies and was then modified to fit our research. The analysis of the data was based on Structural Equation Modeling (SEM) using SmartPLS software, which was used for the empirical analysis of the research model.

Keywords: Cloud computing, cloud ERP systems, DOI, Egypt, SEM, TOE.

674 Fuzzy Set Approach to Study Appositives and Its Impact Due to Positional Alterations

Authors: E. Mike Dison, T. Pathinathan

Abstract:

Computing with Words (CWW) and Possibilistic Relational Universal Fuzzy (PRUF) are two concepts widely used to represent and measure vaguely defined natural phenomena. In this paper, we study the positional alteration of phrases, by which the impact of a natural language proposition gets affected and/or modified. We observe the gradations due to the sensitivity/feeling of a statement towards positional alterations. We derive the classification and modification of the meaning of words due to positional alteration. We present the results with reference to set-theoretic interpretations.

Keywords: Appositive, computing with words, PRUF, semantic sentiment analysis, set theoretic interpretations.

673 Explorative Data Mining of Constructivist Learning Experiences and Activities with Multiple Dimensions

Authors: Patrick Wessa, Bart Baesens

Abstract:

This paper discusses the use of explorative data mining tools that allow the educator to explore new relationships between reported learning experiences and actual activities, even if there are multiple dimensions with a large number of measured items. The underlying technology is based on the so-called Compendium Platform for Reproducible Computing (http://www.freestatistics.org), which was built on top of the computational R Framework (http://www.wessa.net).

Keywords: Reproducible computing, data mining, explorative data analysis, compendium technology, computer-assisted education.

672 Porous Carbon Nanoparticles Co-Doped with Nitrogen and Iron as an Efficient Catalyst for Oxygen Reduction Reaction

Authors: Bita Bayatsarmadi, Shi-Zhang Qiao

Abstract:

The Oxygen Reduction Reaction (ORR) performance of iron and nitrogen co-doped porous carbon nanoparticles (Fe-NPC) with various physical and (electro)chemical properties has been investigated. Fe-NPC nanoparticles are synthesized via a facile soft-templating procedure, using iron(III) chloride hexahydrate as the iron precursor and aminophenol-formaldehyde resin as both the carbon and nitrogen precursor. Fe-NPC nanoparticles show a high surface area (443.83 m2 g-1), high pore volume (0.52 m3 g-1), narrow mesopore size distribution (ca. 3.8 nm), high conductivity (IG/ID = 1.04), high kinetic limiting current (11.71 mA cm-2) and a more positive onset potential (-0.106 V) compared to metal-free NPC nanoparticles (-0.295 V), which makes them highly efficient ORR catalysts in alkaline solution. This study may pave the way for the feasible design of iron- and nitrogen-containing carbon materials (Fe-N-C) for highly efficient oxygen reduction electro-catalysis.

Keywords: Electro-catalyst, mesopore structure, oxygen reduction reaction, soft-template.

671 Operating System Based Virtualization Models in Cloud Computing

Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi

Abstract:

Cloud computing is ready to transform the structure of businesses and learning by supplying real-time applications and providing immediate help for small to medium sized businesses. The ability to run a hypervisor inside a virtual machine is an important feature of virtualization, called nested virtualization. In today's growing field of information technology, many virtualization models are available that provide a convenient approach to implementation, but deciding on a single model is difficult. This paper explains the applications of operating system based virtualization in cloud computing and identifies an appropriate model with respect to different specifications and users' requirements. In the present paper, the most popular models are selected, and the selection was based on container and hypervisor based virtualization. The selected models were compared against a wide range of users' requirements, such as the number of CPUs, memory size, nested virtualization support, live migration, and commercial support, and we identified the most suitable model of virtualization.

Keywords: Virtualization, OS based virtualization, container and hypervisor based virtualization.

670 Regression Approach for Optimal Purchase of Hosts Cluster in Fixed Fund for Hadoop Big Data Platform

Authors: Haitao Yang, Jianming Lv, Fei Xu, Xintong Wang, Yilin Huang, Lanting Xia, Xuewu Zhu

Abstract:

Given a fixed fund, purchasing fewer hosts of higher capability or, inversely, more hosts of lower capability is a trade-off that must be made in practice when building a Hadoop big data platform. An exploratory study is presented for a Housing Big Data Platform project (HBDP), where typical big data computing consists of SQL queries with aggregate, join, and space-time condition selections executed upon massive data from more than 10 million housing units. In HBDP, an empirical formula was introduced to predict the performance of host clusters for the intended typical big data computing, and it was shaped via a regression approach. With this empirical formula, it is easy to suggest an optimal cluster configuration. The investigation was based on a typical Hadoop computing ecosystem: HDFS + Hive + Spark. A proper metric was proposed to measure the performance of Hadoop clusters in HBDP, which was tested and compared with its predicted counterpart on three kinds of typical SQL query tasks. Tests were conducted with respect to factors of CPU benchmark, memory size, virtual host division, and the number of physical hosts in the cluster. The research has been applied to practical cluster procurement for housing big data computing.
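A minimal sketch of the regression-based selection idea (assumptions: a simple linear model in the listed factors and made-up measurements; the actual empirical formula, performance metric and candidate configurations of HBDP are not reproduced here):

```python
# Fit a simple performance model from benchmark runs, then pick the candidate
# cluster within budget that has the best predicted performance.
import numpy as np

# Columns: CPU benchmark score, memory (GB) per host, number of physical hosts
X = np.array([
    [1200, 32, 4],
    [1200, 64, 4],
    [1800, 32, 6],
    [1800, 64, 8],
    [2400, 64, 8],
], dtype=float)
y = np.array([96.0, 87.0, 74.0, 55.0, 43.0])    # measured query time (s), hypothetical

A = np.hstack([X, np.ones((len(X), 1))])        # add intercept term
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares regression

def predict_time(cpu, mem, hosts):
    return float(np.array([cpu, mem, hosts, 1.0]) @ coef)

# Candidate purchases: (cpu score, mem GB, hosts, total price) -- hypothetical catalogue
candidates = [(1800, 64, 10, 92_000), (2400, 64, 7, 95_000), (2400, 128, 6, 99_000)]
budget = 100_000
best = min((c for c in candidates if c[3] <= budget),
           key=lambda c: predict_time(c[0], c[1], c[2]))
print("suggested cluster:", best,
      "predicted time:", round(predict_time(*best[:3]), 1), "s")
```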

Keywords: Hadoop platform planning, optimal cluster scheme at fixed-fund, performance empirical formula, typical SQL query tasks.

669 Computational Feasibility Study of a Torsional Wave Transducer for Tissue Stiffness Monitoring

Authors: Rafael Muñoz, Juan Melchor, Alicia Valera, Laura Peralta, Guillermo Rus

Abstract:

A torsional piezoelectric ultrasonic transducer design is proposed to measure shear moduli in soft tissue with direct access availability, using the shear wave elastography technique. The measurement of shear moduli of tissues is a challenging problem, mainly due to a) the difficulty of isolating a pure shear wave, given the interference of multiple waves of different types (P, S, even guided) emitted by the transducers and reflected at geometric boundaries, and b) the highly attenuating nature of soft tissue materials. An immediate application, overcoming these drawbacks, is the measurement of changes in cervix stiffness to estimate the gestational age at delivery. The design has been optimized using a finite element model (FEM) and a semi-analytical estimator of the probability of detection (POD) to determine a suitable geometry, materials, and generated waves. The technique is based on measuring the time of flight between emitter and receiver to infer the shear wave velocity. Current research is centered on prototype testing and validation. The geometric optimization of the transducer was able to annihilate the compressional wave emission, generating a quite pure torsional shear wave. Currently, the mechanical and electromagnetic coupling between emitter and receiver signals is the research focus. Conclusions: the design overcomes the main problems described. The almost pure torsional shear wave, along with the short time of flight, avoids the possibility of multiple wave interference. The short propagation distance reduces the effect of attenuation and allows the emission of very low energies, assuring good biological safety for human use.
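In the simplest homogeneous, purely elastic idealization, the time-of-flight measurement mentioned above maps to the shear modulus through the standard relations below (a generic sketch, not the authors' calibration procedure):

```latex
c_s = \frac{d}{\Delta t}, \qquad \mu = \rho\, c_s^{\,2},
```

where $d$ is the emitter-receiver propagation distance, $\Delta t$ the measured time of flight, $c_s$ the shear wave velocity, $\rho$ the tissue density, and $\mu$ the inferred shear modulus.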

Keywords: Cervix ripening, preterm birth, shear modulus, shear wave elastography, soft tissue, torsional wave.

668 To Cloudify or Not to Cloudify

Authors: Laila Yasir Al-Harthy, Ali H. Al-Badi

Abstract:

As an emerging business model, cloud computing has been initiated to satisfy the needs of organizations and to push Information Technology as a utility. The shift to the cloud has changed the way Information Technology departments are traditionally managed and has raised many concerns for both the public and private sectors.

The purpose of this study is to investigate the possibility of cloud computing services replacing services provided traditionally by IT departments. Therefore, it aims to 1) explore whether organizations in Oman are ready to move to the cloud; 2) identify the deciding factors leading to the adoption or rejection of cloud computing services in Oman; and 3) provide two case studies, one of a successful cloud provider and another of a successful adopter.

This paper is based on multiple research methods, including a set of interviews with cloud service providers and current cloud users in Oman, and questionnaires collected from experts in the field and from potential users of cloud services.

Despite the limitations of bandwidth capacity and Internet coverage in Oman, which create a challenge in adopting the cloud, it was found that many information technology professionals are encouraged to move to the cloud, while few are resistant to change.

The recent launch of a new Omani cloud service provider and the entrance of other international cloud service providers in the Omani market make this research extremely valuable as it aims to provide real-life experience as well as two case studies on the successful provision of cloud services and the successful adoption of these services.

Keywords: Cloud computing, cloud deployment models, cloud service models and deciding factors.

667 Architecture Based on Dynamic Graphs for the Dynamic Reconfiguration of Farms of Computers

Authors: Carmen Navarrete, Eloy Anguiano

Abstract:

In recent years, computers have increased their computation capacity, as have the networks that interconnect these machines. Networks have improved to the point of reaching today's high data transfer rates. The programs that nowadays try to take advantage of these new technologies cannot be written using traditional programming techniques, since most of the algorithms were designed to be executed on a single processor in a non-concurrent form, instead of being executed concurrently on a set of processors working and communicating through a network. This paper presents the ongoing development of a new system for the reconfiguration of groupings of computers, taking these new technologies into account.

Keywords: Dynamic network topology, resource and task allocation, parallel computing, heterogeneous computing, dynamic reconfiguration.

666 Laplace Decomposition Approximation Solution for a System of Multi-Pantograph Equations

Authors: M. A. Koroma, C. Zhan, A. F. Kamara, A. B. Sesay

Abstract:

In this work, we adopt a combination of the Laplace transform and the decomposition method to find numerical solutions of a system of multi-pantograph equations. The procedure leads to a rapid convergence of the series to the exact solution after computing only a few terms. The effectiveness of the method is demonstrated in some examples by obtaining the exact solution, and in others by computing the absolute error, which decreases as the number of terms of the series increases.
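For a single pantograph equation, the combination works roughly as follows (a schematic sketch of the Laplace decomposition recursion, not the paper's worked examples). Consider

```latex
y'(t) = a\,y(t) + b\,y(qt), \qquad y(0) = y_0, \qquad 0 < q < 1 .
```

Taking the Laplace transform of both sides and writing the solution as a decomposition series $y = \sum_{n \ge 0} y_n$ gives the recursion

```latex
Y(s) = \frac{y_0}{s} + \frac{1}{s}\,\mathcal{L}\!\left[a\,y(t) + b\,y(qt)\right],
\qquad
y_0(t) = y_0, \qquad
y_{n+1}(t) = \mathcal{L}^{-1}\!\left[\frac{1}{s}\,\mathcal{L}\!\left[a\,y_n(t) + b\,y_n(qt)\right]\right].
```

Each $y_{n+1}$ is obtainable in closed form (the terms are polynomials in $t$ here), and the partial sums converge rapidly, consistent with the behaviour reported in the abstract.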

Keywords: Laplace decomposition, pantograph equations, exact solution, numerical solution, approximate solution.

665 A Recommendation to Oncologists for Cancer Treatment by Immunotherapy: Quantitative and Qualitative Analysis

Authors: Mandana Kariminejad, Ali Ghaffari

Abstract:

Today, the treatment of cancer in a relatively short period, with minimum adverse effects, is a great concern for oncologists. In this paper, based on a recently used mathematical model for cancer, a guideline has been proposed for the amount and duration of drug doses for cancer treatment by immunotherapy. Dynamically speaking, the mathematical ordinary differential equation (ODE) model of cancer has different equilibrium points; one of them, called the no-tumor equilibrium point, is unstable. In this paper, based on the number of tumor cells, an intelligent soft computing controller (a combination of a fuzzy logic controller and a genetic algorithm) decides on the amount and duration of drug doses, to eliminate the tumor cells and stabilize the unstable point in a relatively short time. Two different immunotherapy approaches, active and adoptive, have been studied and presented. It is shown that the rate of decay of tumor cells is faster and the drug doses are lower in comparison with results reported in other literature. It is also shown that the period of treatment and the drug doses in adoptive immunotherapy are significantly less than in the active method. A recommendation to oncologists has also been presented.
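As a toy illustration of the genetic-algorithm half of such a soft-computing controller (a sketch only: the one-state tumor model, dose bounds and GA settings below are made up for illustration and are not the paper's ODE model or controller):

```python
# Tiny genetic algorithm that tunes a constant drug dose so that a toy tumor
# model is driven close to zero while penalizing the amount of drug used.
import random

def tumor_after_treatment(dose, days=60, n0=1e6):
    """Toy model: logistic growth minus a dose-proportional kill term (hypothetical)."""
    n, k, r, kill = n0, 1e8, 0.12, 0.25
    for _ in range(days):
        n += r * n * (1 - n / k) - kill * dose * n
        n = max(n, 0.0)
    return n

def fitness(dose):
    # Penalize surviving tumor cells and, lightly, the drug amount (lower is better).
    return tumor_after_treatment(dose) + 1e4 * dose

def genetic_algorithm(pop_size=30, generations=40, low=0.0, high=2.0):
    random.seed(0)
    pop = [random.uniform(low, high) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                 # selection: keep best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                      # crossover: blend parents
            child += random.gauss(0, 0.05)             # mutation
            children.append(min(max(child, low), high))
        pop = parents + children
    return min(pop, key=fitness)

best_dose = genetic_algorithm()
print(f"best constant dose: {best_dose:.3f}, "
      f"remaining tumor cells: {tumor_after_treatment(best_dose):.1f}")
```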

Keywords: Tumor, immunotherapy, fuzzy controller, genetic algorithm, mathematical model.

664 Distribution of Macrobenthic Polychaete Families in Relation to Environmental Parameters in North West Penang, Malaysia

Authors: Mohammad Gholizadeh, Khairun Yahya, Anita Talib, Omar Ahmad

Abstract:

The distribution of macrobenthic polychaetes along the coastal waters of Penang National Park was surveyed to estimate the effect of various environmental parameters at three stations (200 m, 600 m and 1200 m from the shoreline) during six sampling months, from June 2010 to April 2011. The use of polychaetes in descriptive ecology is surveyed in the light of a recent investigation, particularly concerning soft-bottom biota environments. Polychaetes, often connected to the notion of opportunistic species able to proliferate after an enhancement in organic matter, have played a significant role particularly with regard to affected soft-bottom habitats. The objective of this survey was to investigate different environmental stresses on the soft-bottom polychaete community along Teluk Ketapang and Pantai Acheh (Penang National Park) over a one-year period. Variations in the polychaete community were evaluated using univariate and multivariate methods. The results of PCA analysis displayed a positive relation between macrobenthic community structure and environmental parameters such as sediment particle size and organic matter in the coastal water. A total of 604 individuals were examined, grouped into 23 families. The family Nereidae was the most abundant (22.68%), followed by Spionidae (22.02%), Hesionidae (12.58%), Nephtyidae (9.27%) and Orbiniidae (8.61%). It is noticeable that good results can only be obtained on the basis of good taxonomic resolution. We propose that, in monitoring surveys, operative time could be optimized not only by working at a higher taxonomic level on the entire macrobenthic data set, but also by choosing an especially indicative group and working at a lower taxonomic level.
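A minimal sketch of the kind of PCA ordination used to relate community structure to environmental parameters (assuming scikit-learn is available; the abundance matrix and variable names below are hypothetical, not the survey data):

```python
# Ordinate samples by family abundances, then inspect how environmental
# variables align with the first principal component.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Rows = station/month samples, columns = polychaete family abundances (hypothetical)
abundance = np.array([
    [12,  5, 0, 3],
    [ 8,  7, 1, 2],
    [ 2, 14, 6, 0],
    [ 1, 16, 8, 1],
    [ 5,  9, 3, 2],
], dtype=float)
env = np.array([          # e.g. % organic matter, median grain size (hypothetical)
    [1.2, 210],
    [1.4, 190],
    [3.1,  90],
    [3.5,  80],
    [2.0, 150],
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(abundance))

# Correlate each environmental variable with the first component
for name, col in zip(["organic matter", "grain size"], env.T):
    r = np.corrcoef(scores[:, 0], col)[0, 1]
    print(f"PC1 vs {name}: r = {r:+.2f}")
```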

Keywords: Polychaete families, environmental parameters, bioindicators, Pantai Acheh, Teluk Ketapang.

663 Enhancing Security in Resource Sharing Using Key Holding Mechanism

Authors: M. Victor Jose, V. Seenivasagam

Abstract:

This paper describes a logical method to enhance security in grid computing and restrict the misuse of grid resources. The method is an economic and efficient one that avoids the use of special devices. The security issues, techniques, and solutions needed to provide a secure grid computing environment are described. A well-defined process for security management of resource accesses, together with a key holding algorithm, is also proposed. In this method, identity management, access control, authorization, and authentication are effectively handled.

Keywords: Grid security, Irregular binary series, Key holding mechanism, Resource identity, Secure resource access.

662 Computing the Similarity and the Diversity in the Species Based on Cronobacter Genome

Authors: E. Al Daoud

Abstract:

The purpose of computing the similarity and the diversity among species is to trace the process of evolution, to find the relationships between the species, and to discover the unique, the special, the common, and the universal proteins. The proteins of the whole genomes of 40 species are compared with the Cronobacter genome, which is used as the reference genome. More than 3 billion pairwise alignments are performed using blastp. Several findings are introduced in this study; for example, we found 172 proteins in the Cronobacter genome which have insignificant hits in other species, 116 significant proteins in all tested species with very high score values, and 129 proteins that are common in plants but have insignificant hits in mammals, birds, fishes, and insects.
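A minimal sketch of one such pairwise-alignment run with NCBI BLAST+ (assuming the blastp and makeblastdb command-line tools are installed; the FASTA file names and E-value cut-off are placeholders, not the study's actual settings):

```python
# Align the proteins of one query species against the Cronobacter reference
# proteome and keep hits in tabular format for later significance filtering.
import subprocess

# Build a protein BLAST database from the reference proteome (run once).
subprocess.run(
    ["makeblastdb", "-in", "cronobacter_proteins.faa", "-dbtype", "prot",
     "-out", "cronobacter_db"],
    check=True,
)

# blastp of a query species' proteome against the reference database.
subprocess.run(
    ["blastp",
     "-query", "species_01_proteins.faa",
     "-db", "cronobacter_db",
     "-evalue", "1e-5",          # significance cut-off (placeholder)
     "-outfmt", "6",             # tabular: query, subject, %identity, ..., bitscore
     "-num_threads", "8",
     "-out", "species_01_vs_cronobacter.tsv"],
    check=True,
)
```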

Keywords: Genome, species, blastp, conserved genes, Cronobacter.

661 Complex Condition Monitoring System of Aircraft Gas Turbine Engine

Authors: A. M. Pashayev, D. D. Askerov, C. Ardil, R. A. Sadiqov, P. S. Abdullayev

Abstract:

Research shows that the application of probability-statistical methods, especially at the early stages of diagnosing the technical condition of an aviation Gas Turbine Engine (GTE), when the flight information has the properties of fuzziness, limitation, and uncertainty, is unfounded. Hence, the efficiency of applying the new Soft Computing technology, using Fuzzy Logic and Neural Network methods, at these diagnosing stages is considered. For this purpose, fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To build a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. The analysis of changes in the skewness and kurtosis coefficient values characterizes the distributions of GTE operating and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise, estimated with a new recursive Least Squares Method (LSM). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical conditions. As an application of the given technique, an estimation of the technical condition of a new operating aviation engine was made.
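A minimal sketch of the recursive least squares update referred to above (generic RLS for a linear-in-parameters regression model; the forgetting factor and data are placeholders, not the engine models used in the paper):

```python
# Recursive Least Squares: update parameter estimates sample by sample,
# without refitting on the whole data set.
import numpy as np

def rls_update(theta, P, x, y, lam=0.99):
    """One RLS step for the model y ≈ x @ theta, with forgetting factor lam."""
    x = x.reshape(-1, 1)
    K = P @ x / (lam + (x.T @ P @ x).item())        # gain vector
    theta = theta + (K * (y - (x.T @ theta).item())).ravel()
    P = (P - K @ x.T @ P) / lam                      # covariance update
    return theta, P

rng = np.random.default_rng(0)
true_theta = np.array([2.0, -1.0, 0.5])              # hypothetical "true" parameters
theta = np.zeros(3)
P = np.eye(3) * 1e3                                  # large initial covariance

for _ in range(200):
    x = rng.normal(size=3)
    y = x @ true_theta + rng.normal(scale=0.05)      # noisy measurement
    theta, P = rls_update(theta, P, x, y)

print(np.round(theta, 3))                            # close to [2.0, -1.0, 0.5]
```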

Keywords: Aviation gas turbine engine, technical condition, fuzzy logic, neural networks, fuzzy statistics.

660 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine

Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji

Abstract:

The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates and the difficulty of early diagnosis, due to the fact that they present with symptoms that overlap and thus become 'confusable'. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map (FCM) model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptom manifestation. Our model showed 85% accuracy in diagnosis, as against the physicians' initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter that works on an algorithm which mimics the physician's diagnosis process.
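A minimal sketch of how a fuzzy cognitive map is iterated at inference time (a generic FCM update with a sigmoid squashing function; the concepts and weight matrix below are hypothetical, not the paper's malaria model):

```python
# Fuzzy cognitive map inference: concept activations are repeatedly updated
# through the weighted causal links until they stabilize.
import numpy as np

concepts = ["fever", "chills", "recent_travel", "malaria"]
# W[i, j] = causal influence of concept i on concept j (hypothetical weights)
W = np.array([
    [0.0, 0.0, 0.0, 0.6],
    [0.0, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.4],
    [0.0, 0.0, 0.0, 0.0],
])

def sigmoid(x, lam=2.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_infer(initial, W, steps=20):
    a = np.array(initial, dtype=float)
    for _ in range(steps):
        a = sigmoid(a @ W + a)     # include self-memory of the previous activation
    return a

# Observed symptom activations in [0, 1]; the "malaria" concept starts at 0.
state = fcm_infer([0.9, 0.7, 0.8, 0.0], W)
print(dict(zip(concepts, np.round(state, 2))))
```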

Keywords: Medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis.

659 An Efficient Architecture for Dynamic Customization and Provisioning of Virtual Appliance in Cloud Environment

Authors: Rajendar Kandan, Mohammad Zakaria Alli, Hong Ong

Abstract:

Cloud computing is a business model which provides easier management of computing resources. Cloud users can request a virtual machine, install additional software, and configure it if needed. However, users can also request a virtual appliance, which provides a better solution for deploying an application in much less time, as it is a ready-built operating system image with the necessary software installed and configured. Large numbers of virtual appliances are available in different image formats. Users can download available appliances from a public marketplace and start using them. However, the information published about a virtual appliance differs from provider to provider, leading to difficulty in choosing the required virtual appliance, as each is composed of a specific OS with standard software versions. Moreover, even if users choose an appliance from one of these providers, they have no flexibility to choose their own set of software with the required OS and applications. In this paper, we propose a reference architecture for dynamically customizing virtual appliances and provisioning them in an easier manner. We also describe our experience in integrating the proposed architecture with a public marketplace and Mi-Cloud, a cloud management software.

Keywords: Cloud computing, marketplace, virtualization, virtual appliance.

658 Microbial Assessment of Dairy Byproducts in Albania as a Basis for Consumer Safety

Authors: Klementina Puto, Ermelinda Nexhipi, Evi Llaka

Abstract:

Dairy by-products are, due to their composition, a fairly good environment for the growth of microorganisms. Microbial populations have a significant impact on the production of cheese, butter, yogurt, etc. in terms of their organoleptic quality, and at the same time some also cause their breakdown. In this paper, the microbiological contamination of soft cheese, butter, and yogurt, both domestically produced and imported, is assessed as an indicator of hygiene with an impact on public health. The study extended from September 2018 to June 2019 and was divided into three periods: September-December, January-March, and April-June. During this study, a total of 120 samples were analyzed, of which 60 were samples of locally produced cheese and butter, and 60 were samples of imported soft cheese and butter products. The microbial indicators analyzed are Staphylococcus aureus and E. coli. The analyses were conducted at the Food Safety Laboratory (FSIV) in Tirana in accordance with EU Regulation 2073/2005. Sampling was performed according to the specific international standards for these products (ISO 6887 and ISO 8261). Sampling and transport of samples were done under sterile conditions. Also, coding of samples was done to preserve the anonymity of the subjects. After the analysis, the domestically produced soft cheese was more contaminated with S. aureus and E. coli than the imported one. Meanwhile, the imported butter samples analyzed were within norms, compared to the domestic ones. Based on the results, it was concluded that the microbial quality of the cheese, butter, and yogurt samples analyzed remains a real hygiene problem in Albania. The study will also serve business operators in Albania in improving their work to ensure good hygiene on the basis of the HACCP plan and to provide a guarantee of consumer health.

Keywords: Consumer, health, dairy, by-products, microbial.

657 Predicting Effective Permeability of Nanodielectric Composites Bonded by Soft Magnetic Nanoparticles

Authors: A. Thabet, M. Repetto

Abstract:

Dielectric materials play an important role in broad applications, such as electrical and electromagnetic applications. This research studied the prediction of the effective permeability of composite and nanocomposite dielectric materials, based on theoretical analysis, to specify the effect of embedded magnetic inclusions in enhancing the magnetic properties of dielectrics. The effective permeability of plastic and glass nanodielectrics has been predicted with the addition of various types and percentages of magnetic nanoparticles (Fe, Ni-Cu, Ni-Fe, MgZn ferrite, NiZn ferrite) for formulating new nanodielectric magnetic industrial materials. The soft nanoparticle powders used in the new nanodielectrics often possess grain sizes in the range of micrometer- to nano-sized grains and magnetic isotropy, i.e., a random distribution of the magnetic easy axes of the nanograins. This has succeeded in enhancing the characteristics of the new nanodielectric magnetic industrial materials. The results show a significant effect of the inclusion distribution on the effective permeability of nanodielectric magnetic composites, and so explain the effect of the magnetic inclusion types and their concentration on the effective permeability of nanodielectric magnetic materials.
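One classical mixing rule for the effective permeability of a dielectric host containing a small volume fraction of spherical magnetic inclusions is the Maxwell Garnett formula, given here only as a representative example of such predictions (the abstract does not state which mixing model the authors use):

```latex
\mu_{\mathrm{eff}} = \mu_m \,
\frac{\mu_i + 2\mu_m + 2f\,(\mu_i - \mu_m)}
     {\mu_i + 2\mu_m - f\,(\mu_i - \mu_m)},
```

where $\mu_m$ is the permeability of the host dielectric, $\mu_i$ that of the magnetic inclusions, and $f$ their volume fraction.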

Keywords: Nanoparticles, Nanodielectrics, Nanocomposites, Effective Permeability, Magnetic Properties.

656 An Efficient Algorithm for Computing all Program Forward Static Slices

Authors: Jehad Al Dallal

Abstract:

Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a program backward slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. The existing algorithms for computing program slices are designed to compute a slice at a single program point. In these algorithms, the program, or the model that represents the program, is traversed completely or partially once. To compute more than one slice, the same algorithm is applied to every point of interest in the program. Thus, the same program, or program representation, is traversed several times. In this paper, an algorithm is introduced to compute all forward static slices of a computer program by traversing the program representation graph once. Therefore, the introduced algorithm is useful for software engineering applications that require computing program slices at different points of a program. The program representation graph used in this paper is called the Program Dependence Graph (PDG).
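A minimal sketch of the single-traversal idea on a PDG (a memoized depth-first traversal that yields every node's forward slice while visiting each node once, exact for an acyclic dependence graph; this illustrates the goal, not the algorithm of the paper):

```python
# Forward slice of a statement n = all statements reachable from n along
# dependence edges in the PDG. One memoized DFS yields the slices of all nodes.

# Hypothetical PDG: node -> statements that are control/data dependent on it
pdg = {
    1: [2, 3],     # e.g. "x = input()" influences statements 2 and 3
    2: [4],
    3: [4, 5],
    4: [],
    5: [],
}

def all_forward_slices(pdg):
    slices = {}

    def visit(node):
        if node in slices:                  # already computed: reuse
            return slices[node]
        slices[node] = {node}               # placeholder prevents infinite recursion
        result = {node}
        for succ in pdg.get(node, []):
            result |= visit(succ)
        slices[node] = result
        return result

    for node in pdg:
        visit(node)
    return slices

for node, fslice in sorted(all_forward_slices(pdg).items()):
    print(node, "->", sorted(fslice))
```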

Keywords: Program slicing, static slicing, forward slicing, program dependence graph (PDG).
