Search results for: two heterogeneous servers
Paper Count: 803

653 Multifluid Computational Fluid Dynamics Simulation for Sawdust Gasification inside an Industrial Scale Fluidized Bed Gasifier

Authors: Vasujeet Singh, Pruthiviraj Nemalipuri, Vivek Vitankar, Harish Chandra Das

Abstract:

For the correct prediction of thermal and hydraulic performance (bed voidage, suspension density, pressure drop, heat transfer, and combustion kinetics), one should incorporate the correct parameters in the computational fluid dynamics simulation of a fluidized bed gasifier. Given the scarcity of fossil fuels and the need to fulfill the energy demand of a growing population, researchers need to shift their attention to alternatives to fossil fuels. The current research work focuses on the hydrodynamic behavior and gasification of sawdust inside a 2D industrial-scale FBG using the Eulerian-Eulerian multifluid model. The present numerical model is validated with experimental data. Further, this model is extended to predict the gasification characteristics of sawdust by incorporating eight heterogeneous reactions (moisture release, volatile cracking, tar cracking, tar oxidation, char combustion, CO₂ gasification, steam gasification, and methanation) and five homogeneous reactions (oxidation of CO, CH₄, and H₂, and the forward and backward water gas shift (WGS) reactions). In the results section, the composition of the gasification products is analyzed, along with the hydrodynamics of the sawdust and sand phases, the heat transfer between the gas, sand, and sawdust, and the reaction rates of the different homogeneous and heterogeneous reactions along the height of the domain.
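
To make the kinetic treatment concrete, a minimal sketch of the Arrhenius-type rate expression that such heterogeneous steps typically use; the parameters below are hypothetical placeholders, not values from the paper:

```python
import math

R = 8.314   # J/(mol K), universal gas constant
A = 1.0e5   # 1/s, hypothetical pre-exponential factor (e.g., char combustion)
E = 1.2e5   # J/mol, hypothetical activation energy

def arrhenius_rate(temperature_k, c_oxidizer):
    """First-order heterogeneous reaction rate, r = A * exp(-E/RT) * C."""
    return A * math.exp(-E / (R * temperature_k)) * c_oxidizer

# the rate rises steeply over the usual fluidized-bed temperature range
for T in (900.0, 1100.0, 1300.0):
    print(f"T = {T:.0f} K  ->  r = {arrhenius_rate(T, 0.21):.3e}")
```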

Keywords: devolatilization, Eulerian-Eulerian, fluidized bed gasifier, mathematical modelling, sawdust gasification

Procedia PDF Downloads 67
652 Performance Evaluation of Soft RoCE over 1 Gigabit Ethernet

Authors: Gurkirat Kaur, Manoj Kumar, Manju Bala

Abstract:

Ethernet is the most influential and widely used networking technology in the world. With the growing demand for low-latency, high-throughput technologies such as InfiniBand and RoCE, unique features, viz. RDMA (Remote Direct Memory Access), have evolved. RDMA is an effective technology used for reducing system load and improving performance. InfiniBand is a well-known technology that provides high bandwidth and low latency and makes optimal use of in-built features like RDMA. With the rapid evolution of InfiniBand technology and Ethernet lacking the RDMA and zero-copy protocol, the Ethernet community has come out with a new enhancement that bridges the gap between InfiniBand and Ethernet. By adding the RDMA and zero-copy protocol to Ethernet, a new networking technology has evolved, called RDMA over Converged Ethernet (RoCE). RoCE is a standard released by the IBTA standardization body to define an RDMA protocol over Ethernet. With the emergence of lossless Ethernet, RoCE uses InfiniBand's efficient transport to provide a platform for deploying RDMA technology in mainstream data centres over 10GigE, 40GigE and beyond. RoCE provides all of InfiniBand's transport benefits and its well-established RDMA ecosystem combined with converged Ethernet. In this paper, we evaluate a heterogeneous Linux cluster, having multiple nodes with fast interconnects, i.e., gigabit Ethernet and Soft RoCE. This paper presents the heterogeneous Linux cluster configuration and evaluates its performance using Intel's MPI Benchmarks. Our results show that Soft RoCE performs better than Ethernet in various performance metrics like bandwidth, latency and throughput.
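
For context, benchmarks of this kind boil down to a timed ping-pong exchange between two ranks. A minimal sketch using mpi4py, illustrating the measurement idea rather than reproducing the Intel MPI Benchmarks code:

```python
# run with two processes, e.g.: mpirun -np 2 python pingpong.py
from mpi4py import MPI
import numpy as np
import time

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
n_iters = 1000
buf = np.zeros(1024 * 1024, dtype=np.uint8)  # 1 MiB message

comm.Barrier()
start = time.perf_counter()
for _ in range(n_iters):
    if rank == 0:
        comm.Send(buf, dest=1, tag=0)   # ping
        comm.Recv(buf, source=1, tag=1) # pong
    elif rank == 1:
        comm.Recv(buf, source=0, tag=0)
        comm.Send(buf, dest=0, tag=1)
elapsed = time.perf_counter() - start

if rank == 0:
    rtt = elapsed / n_iters             # average round-trip time
    print(f"avg round trip: {rtt * 1e6:.1f} us, "
          f"bandwidth: {2 * buf.nbytes / rtt / 1e6:.1f} MB/s")
```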

Keywords: ethernet, InfiniBand, RoCE, RDMA, MPI, Soft RoCE

Procedia PDF Downloads 434
651 Autonomic Recovery Plan with Server Virtualization

Authors: S. Hameed, S. Anwer, M. Saad, M. Saady

Abstract:

For autonomic recovery with server virtualization, a cogent plan that includes recovery techniques and backups with virtualized servers can be developed instead of assigning an idle server to backup operations. In addition to reducing hardware cost and the data center footprint, the disaster recovery plan can ensure system uptime and meet objectives for high availability, recovery time, recovery point, server provisioning, and quality of service. This autonomic solution would also support disaster management, testing, and development of the recovery site. In this research, a workflow plan is proposed for supporting disaster recovery with virtualization, providing virtual monitoring, requirements engineering, solution decision making, quality testing, and disaster management. This recovery model would make disaster recovery easier, faster, and less error prone.

Keywords: autonomous intelligence, disaster recovery, cloud computing, server virtualization

Procedia PDF Downloads 135
650 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems

Authors: Yong-Kyu Jung

Abstract:

The fast growth in information technology has led to increasing demands to access and process data. CPSs heavily depend on the timing of hardware/software operations and communication over the network, i.e., real-time/parallel operations in CPSs (e.g., autonomous vehicles). Data processing is an important means to overcome this issue confronting data management, reducing the gap between technological growth on one side and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. An ADAC, implemented as an accelerator and/or as apps for servers and smart-connected devices, adaptively rescales digital contents (by 62.8% on average), as well as data processing/access time and energy and encryption/decryption overheads in AI/ML applications (e.g., facial ID/recognition).

Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity

Procedia PDF Downloads 50
649 Load Balancing Algorithms for SIP Server Clusters in Cloud Computing

Authors: Tanmay Raj, Vedika Gupta

Abstract:

For its groundbreaking and substantial power, cloud computing is today's most popular breakthrough. It is a sort of Internet-based computing that allows users to request and receive numerous services in a cost-effective manner. Virtualization, grid computing, and utility computing are the most widely employed emerging technologies in cloud computing, making it the most powerful. However, cloud computing still faces a number of key challenges, such as security, load balancing, and non-critical failure adaptation, to name a few. The massive growth of cloud computing will put an undue strain on servers, and as a result, network performance will deteriorate. A good load balancing adjustment can make cloud computing more productive and increase client satisfaction. Load balancing is an important part of cloud computing because it prevents certain nodes from being overwhelmed while others are idle or have little work to perform. Response time, cost, throughput, performance, and resource usage are all parameters that may be improved using load balancing.
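
As a concrete illustration of one common policy, a minimal least-connections dispatcher in Python; this is a sketch of the general idea, not an algorithm from the paper:

```python
class LeastConnectionsBalancer:
    """Dispatch each new SIP session to the server with the fewest active connections."""

    def __init__(self, servers):
        self.active = {s: 0 for s in servers}   # active connection count per server

    def dispatch(self):
        server = min(self.active, key=self.active.get)
        self.active[server] += 1
        return server

    def release(self, server):
        self.active[server] -= 1                # call when a session ends

lb = LeastConnectionsBalancer(["sip1", "sip2", "sip3"])
for _ in range(5):
    print(lb.dispatch())    # spreads sessions evenly: sip1, sip2, sip3, sip1, ...
```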

Keywords: cloud computing, load balancing, computing, SIP server clusters

Procedia PDF Downloads 87
648 Comparative Study of Globalization and Homogenous Society: South Korea and Greek Society Reaction to Foreign Culture

Authors: Putri Mentari Racharjo

Abstract:

The development of current technology is simplifying the globalization process. An easier globalization process and mobilization are increasing interactions among individuals and societies in different countries. It is also easier for a foreign culture to enter a country and create changes in its society. Differences brought by a foreign culture will most likely affect any society. A heterogeneous society, having various cultures and being used to differences, will find it easier to accept a new culture as long as that culture is not contrary to its essential values. For a homogeneous society, however, with only one language and culture, it will take a longer adjustment time to fully accept the new culture, and there will be a tendency to react in a more negative way. Greece and South Korea are examples of homogeneous societies. Greece, a destination country for immigrants, is having a hard time adjusting itself to accept many immigrants with many cultures. There are various cases of discrimination against immigrants in Greece, where Greek society cannot fully accept the new culture brought by immigrants. South Korea, a country newly popular through K-pop and K-dramas, is attracting people from all over the world. However, the homogeneous South Korean society is also having a hard time fully accepting foreign cultures, resulting in many cases of discrimination based on race and culture. With a qualitative method through a case study and literature review, this article discusses Greek and South Korean societies' reactions to new cultures as an effect of globalization.

Keywords: foreign culture, globalization, greece, homogenous society, South Korea

Procedia PDF Downloads 309
647 Deploying a Platform as a Service Cloud Solution to Support Student Learning

Authors: Jiangping Wang

Abstract:

This presentation describes the design and implementation of PaaS (platform as a service) cloud-based labs that are used in database-related courses to teach students practical skills. Traditionally, all labs are implemented in a desktop-based environment where students have to install heavy client software to access database servers. In order to release students from that burden, we have successfully deployed a cloud-based solution to support database-related courses, through which students and teachers can practice and learn database topics in various courses via cloud access. With its development environment, execution runtime, web server, database server, and collaboration capability, it offers a shared pool of configurable computing resources and a comprehensive environment that supports students' needs without the complexity of maintaining the infrastructure.

Keywords: PaaS, database environment, e-learning, web server

Procedia PDF Downloads 236
646 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses

Authors: Neil Bar, Andrew Heweston

Abstract:

Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation, and open pit mine designs. However, design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a 'reasonable' PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, the shear strength of geologic structure, and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation, and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated 'approximately' or with allowances for some variability rather than 'exactly'. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user's discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit. A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods and obtain markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results.
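
To make the Monte-Carlo route concrete, a minimal sketch computing PF for a planar sliding block with hypothetical strength statistics; this is illustrative only, as real slope analyses use limit equilibrium or numerical models with many more inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# probabilistic inputs (hypothetical statistics, not data from the paper)
c   = rng.normal(25e3, 5e3, n)           # cohesion, Pa
phi = np.radians(rng.normal(45.0, 3.0, n))  # friction angle
# deterministic inputs
beta = np.radians(40.0)                  # slope angle
W, A = 5.0e6, 12.0                       # block weight (N), sliding area (m^2)

# planar-slide factor of safety: resisting / driving forces
fs = (c * A + W * np.cos(beta) * np.tan(phi)) / (W * np.sin(beta))
pf = np.mean(fs < 1.0)                   # PF = fraction of samples with FS < 1
print(f"mean FS = {fs.mean():.2f}, PF = {pf:.4%}")
```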

Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability

Procedia PDF Downloads 189
645 Experimental Study of the Dynamics of Sediments in Natural Channels in a Non-Stationary Flow Regime

Authors: Fourar Ali, Fourar Fatima Zohra

Abstract:

Knowledge of sediment characteristics is fundamental to understanding their sedimentary functioning: the sedimentation, settlement, and erosion processes of cohesive sediments are controlled by complex interactions between physical, chemical, and biological factors. Sediment transport is of primary importance in river hydraulics and river engineering. Indeed, the displacement of sediments can lead to lasting modifications of the bed in terms of its elevation, slope, and roughness. The protection of a bank, for example, is likely to initiate a local incision of the river bed, which, in turn, can lead to the subsidence of the bank. Flows in the natural environment generally occur with heterogeneous boundary conditions because of the distribution of the roughnesses of the fixed or mobile bottoms and the significant deformations of the free surface, especially for shallow flows over an irregular bottom. Bedforms significantly influence flow resistance. The arrangement of particles lining the bottom of the stream bed or experimental channel generates waveforms of different sizes that lead to changes in roughness and consequently to spatial variability in the turbulent characteristics of the flow. The study, which focuses on the laws of friction in alluvial beds, aims to analyze the characteristics of the flows and of the materials constituting natural channels. Experimental results were obtained by simulating these flows over a rough bottom in an experimental channel at the Hydraulics Laboratory of the University of Batna 2. The system of equations governing the problem is solved using the programs CLIPPER.5 and ACP.

Keywords: free surface flow, heterogeneous sand, moving bottom bed, friction coefficient, bottom roughness

Procedia PDF Downloads 52
644 Making Social Accountability Initiatives Work in the Performance of Local Self-Governing Institutions: District-Level Analysis in Rural Assam, India

Authors: Pankaj Kumar Kalita

Abstract:

The ineffectiveness of formal institutional mechanisms such as official audits in improving public service delivery has been a serious concern to scholars working on governance reforms in developing countries. Scholars argue that public service delivery in local self-governing institutions can be improved through the application of informal mechanisms such as social accountability. Social accountability has been reinforced by the engagement of citizens and civic organizations in the process of service delivery to reduce the governance gap in developing countries. However, there are challenges that may impede the scope of establishing social accountability initiatives in the performance of local self-governing institutions. This study makes an attempt to investigate the factors that may impede the scope of establishing social accountability, particularly in culturally heterogeneous societies like India. Analyzing the implementation of two rural development schemes by Panchayats, the local self-governing institutions functioning in rural Assam in India, this study argues that the scope of establishing social accountability in the performance of local self-governing institutions, particularly in the culturally heterogeneous societies of developing countries, will be impeded by the absence of inter-caste and inter-religion networks. Data were collected from five selected districts of Assam using in-depth interviews and a survey. The study further contributes to the debates on 'good governance' and citizen-centric approaches in developing countries.

Keywords: citizen engagement, local self-governing institutions, networks, social accountability

Procedia PDF Downloads 289
643 Task Scheduling and Resource Allocation in Cloud Based on the AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Scheduling of tasks and the optimal allocation of resources in the cloud are based on the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used applications in this field and are characterized by high processing power and storage capacity. In order to increase their efficiency, it is necessary to schedule the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment based on a modified AHP algorithm is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resources are prioritized using the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods, which have a great impact on the ranking, are adopted as the best choice for the mentioned algorithm. The simulation results show a decrease in the average response time, turnaround time, and execution time of input tasks in the proposed method compared to similar (basic) methods.
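
A minimal sketch of the AHP machinery described here: criteria weights are derived from a pairwise comparison matrix, and virtual machines are ranked after normalizing the resource data. All matrices and resource figures below are hypothetical, and the normalizations shown (Linear Max and Linear Max-Min) stand in for the paper's modified variants:

```python
import numpy as np

# pairwise comparisons of memory size, CPU speed, bandwidth (hypothetical)
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# the principal eigenvector of P gives the criteria weights
vals, vecs = np.linalg.eig(P)
w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
w /= w.sum()

# candidate VMs: memory (GB), CPU speed (GHz), bandwidth (Gbps) -- hypothetical
data = np.array([[16, 2.4, 1.0],
                 [32, 3.0, 10.0],
                 [8,  3.6, 1.0]])

max_norm    = data / data.max(axis=0)                                   # Linear Max
maxmin_norm = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0))  # Linear Max-Min

scores = max_norm @ w            # weighted score per VM
print("weights:", np.round(w, 3), "ranking (best first):", np.argsort(-scores))
```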

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 120
642 In Silico Study of the Biological and Pharmacological Activity of Nigella sativa

Authors: Ammar Ouahab, Meriem Houichi, Sanna Mihoubi

Abstract:

Background: Nigella sativa is an annual flowering plant belonging to the Ranunculaceae family. It has many pharmacological activities, such as anti-inflammatory, anti-bacterial, and anti-hepatotoxic activities. Materials: In order to predict the pharmacological activity of Nigella sativa's compounds, several web-based servers were used, namely PubChem, Molinspiration, ADMET-SAR, PASS online, and PharmMapper. In addition, AutoDock was used to investigate the molecular interactions between the selected compounds and their target proteins. Results: All compounds displayed a stable interaction with their targets and satisfactory binding energies, which means that they are active on their targets. Conclusion: Nigella sativa is an effective medicinal plant that has several ethno-medical uses; the latter uses are supported herein via an in-silico study of their pharmacological activities.

Keywords: Nigella sativa, AutoDock, PubChem, Molinspiration, ADMET-SAR, PharmMapper, PASS online server, docking

Procedia PDF Downloads 105
641 Multi-Scale Modeling of Ti-6Al-4V Mechanical Behavior: Size, Dispersion and Crystallographic Texture of Grains Effects

Authors: Fatna Benmessaoud, Mohammed Cheikh, Vencent Velay, Vanessa Vidal, Farhad Rezai-Aria, Christine Boher

Abstract:

Ti-6Al-4V titanium alloy is one of the most widely used materials in the aeronautical and aerospace industries. Because of its high specific strength and good fatigue and corrosion resistance, this alloy is very suitable for moderate-temperature applications. At room temperature, the mechanical behavior of Ti-6Al-4V is generally controlled by the behavior of the alpha phase (the beta phase fraction is less than 8%). The plastic strain of this phase, based notably on crystallographic slip, can be hindered by various obstacles and mechanisms (crystal lattice friction, sessile dislocations, strengthening by solute atoms, grain boundaries, etc.). The grain aspect of the alpha phase (its morphology and texture) and the nature of its crystal lattice (hexagonal close-packed) give the plastic strain heterogeneous, discontinuous, and anisotropic characteristics at the local scale. The aim of this work is to develop a multi-scale model of Ti-6Al-4V mechanical behavior using a crystal plasticity approach; this multi-scale model is then used to investigate the effects of grain size, dispersion of grain size, crystallographic texture, and slip system activation on Ti-6Al-4V mechanical behavior under monotonic quasi-static loading. Nine representative elementary volumes (REVs) are built to take into account the physical elements mentioned above (grain size, dispersion, and crystallographic texture), and the boundary conditions of a tension test are applied. Finally, a simulation of the mechanical behavior of Ti-6Al-4V and a study of slip system activation in the alpha phase are reported. The results show that the macroscopic mechanical behavior of Ti-6Al-4V is strongly linked to the active slip system family (prismatic, basal, or pyramidal). The crystallographic texture determines which family of slip systems can be activated; it therefore gives the plastic strain a heterogeneous character and thus an anisotropic macroscopic mechanical behavior of the modeled Ti-6Al-4V alloy. The grain size also influences the mechanical properties of Ti-6Al-4V, especially the yield stress; as the grain size decreases, the yield strength increases. Finally, the grain distribution, which characterizes the morphology aspect (homogeneous or heterogeneous), makes the deformation fields distinctly heterogeneous, because crystallographic slip is easier in large grains than in small grains, which generates a localization of plastic deformation in certain areas and a concentration of stresses in others.
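
The grain-size effect noted at the end follows the classical Hall-Petch relation, sigma_y = sigma_0 + k_y / sqrt(d). A small worked example with hypothetical coefficients (not fitted to the paper's data):

```python
import math

sigma0 = 300.0   # MPa, hypothetical friction stress of the alpha phase
k_y = 10.0       # MPa*mm^0.5, hypothetical Hall-Petch coefficient

def yield_stress(d_mm):
    """Hall-Petch estimate of yield stress for mean grain diameter d (mm)."""
    return sigma0 + k_y / math.sqrt(d_mm)

# smaller grains -> higher yield strength
for d_um in (2, 10, 50):
    print(f"d = {d_um:3d} um  ->  sigma_y = {yield_stress(d_um * 1e-3):.0f} MPa")
```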

Keywords: multi-scale modeling, Ti-6Al-4V alloy, crystal plasticity, grains size, crystallographic texture

Procedia PDF Downloads 134
640 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation

Authors: H. Khanfari, M. Johari Fard

Abstract:

Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers due to the various diagenetic processes that cause a variety of properties through the reservoir. A good estimation of reservoir heterogeneity, which is defined as the variation in rock properties with location in a reservoir or formation, can help better model the reservoir and thus offer a better understanding of its behavior. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures, and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods to describe this heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods describe the variations in rock properties based on the similarities of the environments in which different beds were deposited. To illustrate the heterogeneity of a reservoir vertically, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), both of which are reviewed briefly in this paper. The Lorenz concept is based on statistics and has been used in petroleum from that point of view. In this paper, we correlated the statistics-based Lorenz method to a petroleum concept, i.e., the Kozeny-Carman equation, and derived the straight-line plot of the Lorenz graph for a homogeneous system. Finally, we applied the two methods to a heterogeneous field in southern Iran and discussed each, separately, with numbers and figures. As expected, these methods show great departure from homogeneity. Therefore, for future investment, the reservoir needs to be treated carefully.
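
A minimal sketch of the Lorenz coefficient computation for a layered reservoir, using hypothetical layer data: cumulative flow capacity is plotted against cumulative storage capacity after sorting layers by k/phi, and L is twice the area between the curve and the homogeneous 45-degree line:

```python
import numpy as np

# hypothetical layers: permeability-thickness and porosity-thickness products
kh   = np.array([500., 120., 80., 40., 10.])   # k*h per layer
phih = np.array([2.0, 3.0, 1.5, 2.5, 1.0])     # phi*h per layer

order = np.argsort(-(kh / phih))               # best flow per unit storage first
F = np.insert(np.cumsum(kh[order]) / kh.sum(), 0, 0.0)      # flow capacity
C = np.insert(np.cumsum(phih[order]) / phih.sum(), 0, 0.0)  # storage capacity

# L = 0 for a homogeneous system (straight line), approaching 1 with heterogeneity
L = 2.0 * (np.trapz(F, C) - 0.5)
print(f"Lorenz coefficient = {L:.3f}")
```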

Keywords: carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variations (V), Lorenz coefficient (L)

Procedia PDF Downloads 190
639 Designing an MTB-MLE for Linguistically Heterogeneous Contexts: A Practitioner’s Perspective

Authors: Ajay Pinjani, Minha Khan, Ayesha Mehkeri, Anum Iftikhar

Abstract:

There is much research available on the benefits of adopting mother tongue-based multilingual education (MTB MLE) in primary school classrooms, but there is limited guidance available on how to design such programs for low-resource and linguistically diverse contexts. This paper is an effort to bridge the gap between theory and practice by offering a practitioner's perspective on designing an MTB MLE program for linguistically heterogeneous contexts. The research combines findings from current academic literature on MTB MLE, a study of global MTB MLE programs, interviews with practitioners, policy-makers, and academics worldwide, and a socio-linguistic survey carried out in parts of Tharparkar, Pakistan, the area selected for the envisioned pilot implementation. These findings enabled the creation of 'guiding principles' that provide structure for the development of a contextualized and holistic MTB MLE program. The guiding principles direct the creation of teaching and learning materials, the creation of an effective teaching and learning environment, community engagement, and program evaluation. Additionally, the paper demonstrates the development of a context-specific language ladder framework that outlines the language journey of a child's education, beginning with the mother tongue or most familiar language in the early years and then gradually transitioning to other languages. Both the guiding principles and the language ladder can be adapted to any multilingual context. Thus, this research provides MTB MLE practitioners with assistance in developing an MTB MLE model best suited to their context.

Keywords: mother tongue based multilingual education, education design, language ladder, language issues, heterogeneous contexts

Procedia PDF Downloads 84
638 Transesterification of Waste Cooking Oil for Biodiesel Production Using Modified Clinoptilolite Zeolite as a Heterogeneous Catalyst

Authors: D. Mowla, N. Rasti, P. Keshavarz

Abstract:

The depletion of fossil fuel sources, the increasing emission of polluting gases, and global warming effects increase the demand for renewable fuels. One of the main candidate alternative fuels is biodiesel. Biodiesel limits greenhouse gas effects due to its closed CO2 cycle. It has greater biodegradability, lower combustion emissions such as CO, SOx, HC, and PM, and lower toxicity than petro-diesel. However, biodiesel has a high production cost due to the high price of plant oils as raw material. The utilization of waste cooking oils (WCOs) as feedstock, given their low price and disposal problems, therefore reduces the biodiesel production cost. In this study, the production of biodiesel by transesterification of methanol and WCO using modified sodic potassic (SP) clinoptilolite zeolite and sodic potassic calcic (SPC) clinoptilolite zeolite as heterogeneous catalysts was investigated. These natural clinoptilolite zeolites were modified with KOH solution to increase the site activity. The optimum biodiesel yields for SP clinoptilolite and SPC clinoptilolite were 95.8% and 94.8%, respectively. The produced biodiesel was analyzed and compared with petro-diesel and ASTM limits. The properties of the produced biodiesel conform well to the ASTM limits. The density, kinematic viscosity, cetane index, flash point, cloud point, and pour point of the produced biodiesel were all higher than those of petro-diesel, but its acid value was lower. Finally, the reusability and regeneration of the catalysts were investigated. The results indicated that the spent zeolites cannot be reused directly for transesterification, but they can be regenerated easily and can recover high activity.

Keywords: biodiesel, renewable fuel, transesterification, waste cooking oil

Procedia PDF Downloads 208
637 An Agent-Based Model of Innovation Diffusion Using Heterogeneous Social Interaction and Preference

Authors: Jang kyun Cho, Jeong-dong Lee

Abstract:

The advent of the Internet, mobile communications, and social network services has stimulated social interactions among consumers, allowing people to affect one another's innovation adoptions by exchanging information more frequently and more quickly. Previous diffusion models, such as the Bass model, however, face limitations in reflecting such recent phenomena in society. These models are weak in their ability to model interactions between agents; they model aggregated-level behaviors only. The agent-based model, which is an alternative to the aggregate model, is good for individual modeling, but it is still not based on an economic perspective of social interactions. This study assumes the presence of social utility from other consumers in the adoption of innovation and investigates the effect of individual interactions on innovation diffusion by developing a new model called the interaction-based diffusion model. By comparing this model with previous diffusion models, the study also examines how the proposed model explains innovation diffusion from the perspective of economics. In addition, the study recommends the use of a small-world network topology instead of cellular automata to describe innovation diffusion. The model is based on individual preference and heterogeneous social interactions using a utility specification, which is expandable and thus able to encompass various issues in diffusion research, such as reservation price. Furthermore, the study proposes a new framework to forecast aggregated-level market demand from individual-level modeling. The model also exhibits a good fit to real market data. It is expected that the study will contribute to our understanding of the innovation diffusion process through its microeconomic theoretical approach.
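
A minimal sketch of the kind of model described, with adoption driven by a private preference plus social utility from neighbours on a Watts-Strogatz small-world network; all parameters are hypothetical and the utility form is a simplification of the paper's specification:

```python
import random
import networkx as nx

random.seed(1)
G = nx.watts_strogatz_graph(n=1000, k=8, p=0.1)   # small-world topology
pref = {v: random.random() for v in G}            # heterogeneous private preference
adopted = {v: pref[v] > 0.95 for v in G}          # a few early innovators

beta = 1.5   # weight of social utility from adopting neighbours
history = []
for step in range(30):
    for v in G:
        if not adopted[v]:
            frac = sum(adopted[u] for u in G[v]) / G.degree(v)
            if pref[v] + beta * frac > 1.0:       # total utility crosses threshold
                adopted[v] = True
    history.append(sum(adopted.values()))

print(history)   # cumulative adopters per step traces an S-shaped diffusion curve
```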

Keywords: innovation diffusion, agent based model, small-world network, demand forecasting

Procedia PDF Downloads 312
636 Heterogeneous Photocatalytic Degradation of Ibuprofen in Ultrapure Water, Municipal and Pharmaceutical Industry Wastewaters Using a TiO2/UV-LED System

Authors: Nabil Jallouli, Luisa M. Pastrana-Martínez, Ana R. Ribeiro, Nuno F. F. Moreira, Joaquim L. Faria, Olfa Hentati, Adrián M. T. Silva, Mohamed Ksibi

Abstract:

Degradation and mineralization of ibuprofen (IBU) were investigated using ultraviolet (UV) light emitting diodes (LEDs) in TiO2 photocatalysis. Samples of ultrapure water (UP) and of a secondary treated effluent of a municipal wastewater treatment plant (WWTP), both spiked with IBU, as well as a highly concentrated IBU (230 mg L-1) pharmaceutical industry wastewater (PIWW), were tested in the TiO2/UV-LED system. Three operating parameters, namely pH, catalyst load, and number of LEDs, were optimized. The process efficiency was evaluated in terms of IBU removal using high performance liquid chromatography (HPLC) and ultra-high performance liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). Additionally, mineralization was investigated by determining the dissolved organic carbon (DOC) content. The chemical structures of the transformation products were proposed based on data obtained using liquid chromatography with a high resolution ion trap/time-of-flight mass spectrometer (LC-MS-IT-TOF), and a possible pathway of IBU degradation was accordingly proposed. Bioassays were performed using the marine bacterium Vibrio fischeri to evaluate the potential acute toxicity of the original and treated wastewaters. TiO2 heterogeneous photocatalysis was efficient in removing IBU from UP and from PIWW, and less efficient in treating the wastewater from the municipal WWTP. The acute toxicity decreased by ca. 40% after treatment, regardless of the studied matrix.

Keywords: acute toxicity, Ibuprofen, UV-LEDs, wastewaters

Procedia PDF Downloads 224
635 Importance of Ethics in Cloud Security

Authors: Pallavi Malhotra

Abstract:

This paper examines the importance of ethics in cloud computing. In modern society, cloud computing offers individuals and businesses unlimited space for storing and processing data. Much of the data stored in the cloud by users such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions requires a high level of confidentiality and safeguarding. Cloud computing offers centralized storage and processing of data, and this has immensely contributed to the growth of businesses and improved sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients' information and possible manipulation of the data by third parties. This document suggests approaches that the various stakeholders should take to address the ethical issues involved in cloud-computing services. Ethical education and training are key for all stakeholders involved in handling data and information stored or processed in the cloud.

Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education

Procedia PDF Downloads 303
634 Efficient Utilization of Commodity Computers in Academic Institutes: A Cloud Computing Approach

Authors: Jasraj Meena, Malay Kumar, Manu Vardhan

Abstract:

Cloud computing is a new technology in industry and academia. The technology has grown and matured in the last half decade and has proven its significant role in the changing environment of IT infrastructure, where cloud services and resources are offered over the network. Cloud technology enables users to use services and resources without being concerned about the technical implications of the technology. Substantial research work has been performed on the usage of cloud computing in educational institutes, and the majority of it provides cloud services over high-end blade servers or other high-end CPUs. This paper, however, proposes a new stack called “CiCKAStack”, which provides cloud services over unutilized computing resources, named commodity computers. “CiCKAStack” provides IaaS and PaaS using the underlying commodity computers. This will not only increase the utilization of existing computing resources but also provide an organized file system, on-demand computing resources, and a design and development environment.

Keywords: commodity computers, cloud-computing, KVM, CloudStack, AppScale

Procedia PDF Downloads 238
633 Cytogenetic Characterization of the VERO Cell Line Based on Comparisons with the Subline; Implication for Authorization and Quality Control of Animal Cell Lines

Authors: Fumio Kasai, Noriko Hirayama, Jorge Pereira, Azusa Ohtani, Masashi Iemura, Malcolm A. Ferguson Smith, Arihiro Kohara

Abstract:

The VERO cell line was established in 1962 from normal tissue of an African green monkey, Chlorocebus aethiops (2n=60), and has been commonly used worldwide for toxin screening or as a cell substrate for the production of viral vaccines. The VERO genome was sequenced in 2014; however, its cytogenetic features have not been fully characterized, as it contains several chromosome abnormalities and different karyotypes coexist in the cell line. In this study, the VERO cell line (JCRB0111) was compared with one of its sublines. In contrast to the modal chromosome number of 59 in the VERO cell line, the subline had two peaks of 56 and 58 chromosomes. M-FISH analysis using human probes revealed that the VERO cell line was characterized by a translocation t(2;25) found in all metaphases, which was absent in the subline. Different abnormalities detected only in the subline show that the cell line is heterogeneous, indicating that the subline has the potential to change its genomic characteristics during cell culture. The various alterations in the two independent lineages suggest that the genomic changes in both VERO cells can be accounted for by progressive rearrangements during their evolution in culture. Both t(5;X) and t(8;14), observed in all metaphases of the two cell lines, might have a key role in VERO cells and could be used as genetic markers to identify them. The flow karyotype shows distinct differences from normal. Further analysis of sorted abnormal chromosomes may uncover other characteristics of VERO cells. In the absence of STR data, cytogenetic data are important in characterizing animal cell lines and can serve as an indicator for their quality control.

Keywords: VERO, cell culture passage, chromosome rearrangement, heterogeneous cells

Procedia PDF Downloads 388
632 Network Functions Virtualization-Based Virtual Routing Function Deployment under Network Delay Constraints

Authors: Kenichiro Hida, Shin-Ichi Kuribayashi

Abstract:

An NFV-based network implements a variety of network functions with software on general-purpose servers, which allows the network operator to select any capabilities and locations of network functions without physical constraints. In this paper, we evaluate the influence of the maximum tolerable network delay on the virtual routing function deployment guidelines that the authors proposed previously. Our evaluation results have revealed the following: (1) the more severe the maximum tolerable network delay condition becomes, the more the number of areas where the route selection function is installed increases, and the more the total network cost increases; (2) the higher the routing function cost relative to the circuit bandwidth cost, the larger the increase ratio of the total network cost becomes under the maximum tolerable network delay condition.
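
The trade-off behind these findings can be sketched with a toy cost model; the numbers and the delay/bandwidth relations below are entirely hypothetical, not the authors' formulation. Spreading the routing function over more areas raises function cost but lowers detour bandwidth cost and delay:

```python
AREAS = 10

def total_cost(n_routing, func_cost, bw_cost):
    # hypothetical model: bandwidth cost shrinks as routing spreads out
    return n_routing * func_cost + bw_cost * AREAS / n_routing

for delay_limit in (5.0, 3.0, 2.0):   # maximum tolerable network delay, ms
    # hypothetical delay that falls as functions are placed closer to users
    feasible = [n for n in range(1, AREAS + 1) if 10.0 / n <= delay_limit]
    best = min(feasible, key=lambda n: total_cost(n, func_cost=4.0, bw_cost=6.0))
    print(f"delay <= {delay_limit} ms -> {best} areas, "
          f"cost {total_cost(best, 4.0, 6.0):.1f}")
    # tightening the delay limit forces more areas and a higher total cost,
    # mirroring finding (1) above
```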

Keywords: NFV (Network Functions Virtualization), resource allocation, virtual routing function, minimum total network cost

Procedia PDF Downloads 215
631 An Experimental Testbed Using Virtual Containers for Distributed Systems

Authors: Parth Patel, Ying Zhu

Abstract:

Distributed systems have become ubiquitous, and they continue their growth through a range of services. With advances in resource virtualization technology such as Virtual Machines (VM) and software containers, developers no longer require high-end servers to test and develop distributed software. Even in commercial production, virtualization has streamlined the process of rapid deployment and service management. This paper introduces a distributed systems testbed that utilizes virtualization to enable distributed systems development on commodity computers. The testbed can be used to develop new services, implement theoretical distributed systems concepts for understanding, and experiment with virtual network topologies. We show its versatility through two case studies that utilize the testbed for implementing a theoretical algorithm and developing our own methodology to find high-risk edges. The results of using the testbed for these use cases have proven the effectiveness and versatility of this testbed across a range of scenarios.
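
A minimal sketch of what such container-based topology construction can look like with the Docker SDK for Python; the image name and topology are illustrative, and the paper's own tooling may differ:

```python
import docker

client = docker.from_env()

# a user-defined bridge network gives containers DNS resolution by name
client.networks.create("p2p-testbed", driver="bridge")

# launch four long-running peers attached to the same virtual network
peers = [
    client.containers.run("alpine", command="sleep 3600",
                          name=f"peer{i}", network="p2p-testbed",
                          detach=True)
    for i in range(4)
]

# each container can now reach the others by hostname (peer0 .. peer3)
for p in peers:
    print(p.name, p.status)
```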

Keywords: distributed systems, experimental testbed, peer-to-peer networks, virtual container technology

Procedia PDF Downloads 111
630 FLEX: A Backdoor Detection and Elimination Method in Federated Scenario

Authors: Shuqi Zhang

Abstract:

Federated learning allows users to participate in collaborative model training without sending data to third-party servers, reducing the risk of user data privacy leakage, and is widely used in smart finance and smart healthcare. However, the distributed architecture of federated learning itself and the existence of secure aggregation protocols make it inherently vulnerable to backdoor attacks. To solve this problem, FLEX, a federated learning backdoor defense framework based on group aggregation, cluster analysis, and neuron pruning, is proposed, and inter-compatibility with secure aggregation protocols is achieved. The good performance of FLEX is verified experimentally by building a horizontal federated learning framework on the CIFAR-10 dataset; it achieves a 98% success rate of backdoor detection and reduces the success rate of backdoor tasks to 0%-10%.
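
A minimal sketch of the cluster-analysis idea on synthetic data: clients whose flattened model updates point away from the majority direction are flagged. The actual FLEX pipeline additionally involves group aggregation and neuron pruning:

```python
import numpy as np

def flag_outliers(updates, threshold=0.5):
    """Flag clients whose update direction deviates from the cohort."""
    U = np.stack(updates)
    U = U / np.linalg.norm(U, axis=1, keepdims=True)   # unit vectors
    sim = U @ U.T                                      # pairwise cosine similarity
    mean_sim = (sim.sum(axis=1) - 1.0) / (len(updates) - 1)
    return [i for i, s in enumerate(mean_sim) if s < threshold]

rng = np.random.default_rng(0)
benign = [rng.normal(0, 1, 100) + 5 for _ in range(9)]  # roughly aligned updates
poisoned = [rng.normal(0, 1, 100) - 5]                  # deviating (backdoored) update
print(flag_outliers(benign + poisoned))                 # -> [9]
```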

Keywords: federated learning, secure aggregation, backdoor attack, cluster analysis, neuron pruning

Procedia PDF Downloads 62
629 Meanings and Concepts of Standardization in Systems Medicine

Authors: Imme Petersen, Wiebke Sick, Regine Kollek

Abstract:

In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expression, metabolic pathways, and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the upcoming problems of systematizing, standardizing, and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous datasets of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and incorporated forms and structures. However, this data-centred concept of standardization in systems medicine runs contrary to the debate on standardization in science and technology studies (STS), which rather emphasizes the dynamics, contexts, and negotiations of standard operating procedures. Based on empirical work on research consortia that explore the molecular profiles of diseases to establish systems medical approaches in the clinic in Germany, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and which consequences for knowledge production (e.g., modeling) arise from this. Hence, different concepts and meanings of standardization are explored to gain deeper insight into standard operating procedures, not only in systems medicine but also beyond.

Keywords: data, science and technology studies (STS), standardization, systems medicine

Procedia PDF Downloads 310
628 Reliability Analysis of Computer Centre at Yobe State University Nigeria under Different Repair Policies

Authors: Vijay Vir Singh

Abstract:

In this paper, we focus on the reliability and performance analysis of the Computer Centre (CC) at Yobe State University, Damaturu, Nigeria. The CC consists of three servers: one database mail server, one redundant server, and one server shared with the client computers in the CC (called the local server). Observing the different possibilities of functioning of the CC, an analysis has been done to evaluate the various reliability characteristics of the system. The system can completely fail due to failure of the router, failure of the redundant server before the mail server is repaired, or switch failure. The system can also partially fail when the local server fails. The system can additionally fail completely due to a cooling failure, an electricity failure, or a natural calamity such as an earthquake or fire. All failure rates are assumed to be constant, while repair follows two types of distributions: general and the Gumbel-Hougaard family copula.
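
A minimal Monte-Carlo sketch of the redundancy idea analyzed here: a two-server mail subsystem that is down only while both servers are failed, with hypothetical constant failure and repair rates. (The paper's repairs instead follow general and Gumbel-Hougaard copula distributions, which this sketch does not model.)

```python
import random

random.seed(0)
FAIL = 1 / 1000.0   # per-server failure rate, 1/h (hypothetical)
REPAIR = 1 / 10.0   # repair rate, 1/h (hypothetical, single repair crew)

def availability(horizon=1_000_000.0):
    """Simulate the birth-death process of failed servers and measure uptime."""
    t = down = 0.0
    failed = 0                                   # failed servers: 0, 1 or 2
    while t < horizon:
        fail_rate = (2 - failed) * FAIL
        repair_rate = REPAIR if failed else 0.0
        rate = fail_rate + repair_rate
        dt = random.expovariate(rate)            # time to the next event
        if failed == 2:
            down += dt                           # redundancy exhausted: system down
        t += dt
        if random.random() < fail_rate / rate:
            failed += 1
        else:
            failed -= 1
    return 1.0 - down / t

print(f"estimated availability: {availability():.6f}")
```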

Keywords: reliability, availability, Gumbel-Hougaard family copula, MTTF, internet data centre

Procedia PDF Downloads 440
627 Classification of IoT Traffic Security Attacks Using Deep Learning

Authors: Anum Ali, Kashaf ad Dooja, Asif Saleem

Abstract:

The future trend of smart cities will be towards the Internet of Things (IoT); IoT creates dynamic connections in a ubiquitous manner. Smart cities offer ease and flexibility for daily life matters. With small devices connected to cloud servers based on IoT, network traffic between these devices is growing exponentially, and its security is a concern, since the rising rate of cyber attacks may make the network traffic vulnerable. This paper discusses the latest machine learning approaches in related work; further, to tackle the increasing rate of cyber attacks, a machine learning algorithm is applied to IoT-based network traffic data. The proposed algorithm trains itself on the data and identifies different sections of device interaction by using supervised learning, acting as a classifier related to a specific IoT device class. The simulation results clearly identify the attacks and produce fewer false detections.
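
A minimal sketch of the supervised-classification step on synthetic flow features; the feature set, data, and network shape are illustrative stand-ins for the paper's deep model and real traffic traces:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# hypothetical per-flow features: [mean packet size, mean inter-arrival, duration]
normal = rng.normal([300, 0.5, 10], [50, 0.1, 3], size=(500, 3))
attack = rng.normal([900, 0.01, 60], [100, 0.005, 10], size=(500, 3))
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 500)   # 0 = benign, 1 = attack

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500).fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```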

Keywords: IoT, traffic security, deep learning, classification

Procedia PDF Downloads 123
626 Proof of Concept Design and Development of a Computer-Aided Medical Evaluation of Symptoms Web App: An Expert System for Medical Diagnosis in General Practice

Authors: Ananda Perera

Abstract:

Computer-Assisted Medical Evaluation of Symptoms (CAMEOS) is a medical expert system designed to help general practitioners (GPs) make an accurate diagnosis. CAMEOS comprises a knowledge base, user input, an inference engine, a reasoning module, and an output statement. The knowledge base was developed by the author. User input is an HTML file through which the physician user collects data during the consultation. The data are sent to the inference engine on the servers. CAMEOS uses set theory to simulate diagnostic reasoning. The program output is a list of differential diagnoses, the most probable diagnosis, and the diagnostic reasoning.
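
A minimal sketch of how set theory can drive such diagnostic scoring; the knowledge-base entries below are hypothetical illustrations, not the author's clinical data:

```python
# each candidate diagnosis is a set of characteristic features (hypothetical)
KNOWLEDGE_BASE = {
    "migraine":   {"headache", "nausea", "photophobia"},
    "influenza":  {"fever", "cough", "myalgia", "headache"},
    "meningitis": {"fever", "headache", "neck stiffness", "photophobia"},
}

def differential(symptoms):
    """Rank diagnoses by the fraction of their feature set present in the patient."""
    scores = {
        dx: len(symptoms & features) / len(features)
        for dx, features in KNOWLEDGE_BASE.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

# the top entry plays the role of the most probable diagnosis
print(differential({"headache", "fever", "photophobia"}))
```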

Keywords: CDSS, computerized decision support systems, expert systems, general practice, diagnosis, diagnostic systems, primary care diagnostic system, artificial intelligence in medicine

Procedia PDF Downloads 127
625 Adaptive Certificate-Based Mutual Authentication Protocol for Mobile Grid Infrastructure

Authors: H. Parveen Begam, M. A. Maluk Mohamed

Abstract:

Mobile grid computing is an environment that allows the sharing and coordinated use of diverse resources in dynamic, heterogeneous, and distributed environments using different types of portable electronic devices. In a grid environment, security issues such as authentication, authorization, message protection, and delegation are handled by the GSI (Grid Security Infrastructure). Providing better security between mobile devices and the grid infrastructure is a major issue because of the open nature of wireless networks and the heterogeneous and distributed environment. In a mobile grid environment, the individual computing devices may be resource-limited in isolation, but as an aggregated sum they have the potential to play a vital role within the mobile grid environment. An adaptive methodology or solution is needed to solve issues such as the authentication of a base station, the security of information flowing between a mobile user and a base station, the prevention of attacks within a base station, the hand-over of authentication information, the communication cost of establishing a session key between a mobile user and a base station, and the computational complexity of achieving authenticity and security. The sharing of device resources can be achieved only through trusted relationships between the mobile hosts (MHs). Before accessing a grid service, a mobile device should be proven authentic. This paper proposes a dynamic certificate-based mutual authentication protocol between two mobile hosts in a mobile grid environment. The certificate generation process is done by a CA (Certificate Authority) for all authenticated MHs. Security (through the validity period of the certificate) and dynamicity (through the transmission time) can be achieved with the secure service certificates. The authentication protocol is built on communication services to provide cryptographically secured mechanisms for verifying the identity of users and resources.
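
A minimal sketch of the certificate-plus-challenge idea using Ed25519 signatures from the Python cryptography library. This is a simplified stand-in: the paper's protocol involves X.509-style service certificates with validity periods and session-key establishment, which are omitted here:

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def raw_public_bytes(key):
    return key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)

# the CA endorses each mobile host's public key; this signature stands in
# for a full service certificate carrying identity and validity period
ca = Ed25519PrivateKey.generate()
host_a = Ed25519PrivateKey.generate()
cert_a = ca.sign(raw_public_bytes(host_a))

# host B verifies A's certificate, then challenges A with a fresh nonce;
# a mutual run repeats the same two steps in the other direction
try:
    ca.public_key().verify(cert_a, raw_public_bytes(host_a))  # certificate check
    nonce = os.urandom(32)
    response = host_a.sign(nonce)                             # A proves key possession
    host_a.public_key().verify(response, nonce)
    print("host A authenticated")
except InvalidSignature:
    print("authentication failed")
```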

Keywords: mobile grid computing, certificate authority (CA), SSL/TLS protocol, secured service certificates

Procedia PDF Downloads 280
624 The Nexus of Decentralized Policy, Social Heterogeneity and Poverty in Equitable Forest Benefit Sharing in the Lowland Community Forestry Program of Nepal

Authors: Dhiraj Neupane

Abstract:

Decentralized policies and practices have largely concentrated on the transfer of decision-making authority from central to local institutions (or people) in the developing world. Such policies and practices have always aimed at the equitable and efficient management of resources in the interest of poverty reduction. The transfer of forest decision-making autonomy has likewise been glorified as the best forest management alternative for maximizing forest benefits and improving the livelihoods of local people living near forests. However, social heterogeneity and the poor decision-making capacity of local institutions (or people) pose a nexus of problems when managing resources and sharing forest benefits among user households, despite the policy objectives. The situation is severe in the lowland of Nepal, where forest resources have higher economic potential and user households have heterogeneous socio-economic conditions. The study discovered that, utilizing the power of decision-making autonomy, user households set low prices for timber, considering equitable access to timber for all user households, as it is the most valuable product of the community forest. Because the society is heterogeneous in its socio-economic conditions, households in better economic conditions always took a larger share of the forest benefits. The low valuation of timber has negative consequences for equitable benefit sharing and gives poor support to the livelihood improvement of user households. Moreover, the low valuation is likely to increase local demand for timber and thus the human pressure on forests.

Keywords: decentralized forest policy, Nepal, poverty, social heterogeneity, Terai

Procedia PDF Downloads 260